GPT

Generative Pre-trained Transformer

Science fiction has us think of artificial intelligence as something for robots and alien spaceships, but the reality is very different. In this context, an artificial intelligence is quite unlike the natural intelligence exhibited by humans. The phrase that conjures thoughts of robotic assassins and intelligent holographic people is actually artificial general intelligence, or AGI.

The eponymous AI that NovelAI uses combines several technical terms: it is an Autoregressive Language Model and a Generative Pre-trained Transformer.

What does GPT mean?
This will be an oversimplification of sorts. If you're really interested, you should go ahead and consult Wikipedia's page on GPT-3.


 * Autoregressive means the AI generates text one piece at a time, with each new piece based on everything that came before it - the information you fed in plus its own previous output - along with a random element.


 * Language Models are AIs designed to replicate human language for various use cases, such as autocorrection, text prediction, composing letters, writing advertisements, etc.


 * Generative means the AI creates content, as opposed to a discriminative model, which is used to tell different kinds of content apart.


 * Pre-trained means that the AI is first trained by making it read a lot of text. A sample of the text is shown to the AI, and it is asked what is supposed to appear next. Its performance is measured by how well it managed to "guess" the follow-up text; the measure of how wrong its guesses are is called the "loss". Some randomness at generation time is still useful, as it gives the AI creativity.


 * Transformer means that the AI processes the entire input text at once, and lends different weight to different content, instead of reading things one word at a time. This allows for faster and more efficient training for language models.
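The pre-training idea above - show the AI some text, ask it to guess what comes next, and score the guess - can be sketched in a few lines of Python. The "model" here is a deliberately crude stand-in (it just predicts the word that most often followed the current one in its training text), not how a real Transformer works:

```python
# Toy sketch of the pre-training objective: read text, guess the next
# word, and score the guesses. The "model" is a hypothetical stand-in
# that predicts the most common follow-up seen during training.

from collections import Counter, defaultdict

training_text = "the cat sat on the mat and the cat slept".split()

# Count which word follows which in the training data.
follow_counts = defaultdict(Counter)
for current, nxt in zip(training_text, training_text[1:]):
    follow_counts[current][nxt] += 1

def predict_next(word):
    """Guess the follow-up word seen most often during training."""
    if word not in follow_counts:
        return None
    return follow_counts[word].most_common(1)[0][0]

# Evaluate: how often does the model guess the actual next word?
correct = sum(
    predict_next(current) == nxt
    for current, nxt in zip(training_text, training_text[1:])
)
print(f"{correct}/{len(training_text) - 1} next words guessed correctly")
```

A real language model does the same kind of thing, but instead of a lookup table it uses billions of parameters to estimate the probability of every possible next token.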

What does this imply?
No, you are not talking to a machine with thoughts and feelings. While NovelAI is very advanced, it is purely designed as a literature generation service. At times, you may feel like the AI understands you, and it may even seem emotional. This is it performing a very convincing facsimile of human literature - exactly what it was designed to do.

The AI is only capable of producing text that looks convincingly human, without understanding language on its own.

As such, NovelAI has no understanding of morality or even the fundamental rules of grammar, and it may reproduce the prejudices and biases carried by humankind as a whole. Please keep this in mind when experiencing the narratives presented in your Stories, and remember you can always ban certain tokens from appearing.

Parameters
Think of an AI like a human brain. Inside the brain, you find neurons, and those neurons are connected by synapses.

The number of parameters is the number of synapses. It's a measure of the density of the neural network's connections. The denser it is, the richer the connections, and the broader the scope of the AI's creativity.

The model is, in reality, a vector space: each parameter is a floating-point number, and groups of parameters make up vectors. Each vector encodes connections between tokens, and that network of vectors is an attempt to represent human language mathematically.
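As a rough illustration, a "model" is just a large collection of these floating-point numbers, and the parameter count is simply how many of them there are. The layer sizes below are made up for the sake of the example and have nothing to do with NovelAI's actual architecture:

```python
# Rough sketch: the parameter count of a network is the total number
# of floating-point values it stores. Layer sizes here are invented
# purely for illustration.

layers = [
    (768, 3072),   # weights of a hypothetical hidden layer
    (3072, 768),   # projection back down
]

total = 0
for fan_in, fan_out in layers:
    total += fan_in * fan_out   # one float per connection ("synapse")
    total += fan_out            # plus one bias float per output neuron

print(f"{total:,} parameters")
```

Scale this up across dozens of much larger layers and you reach the billions of parameters that large language models are known for.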

A model with more parameters takes considerably more Video Memory on the machine's hardware, but is capable of greater creativity and more 'natural' language.

Why Video Memory? Simple: CPUs perform operations largely one at a time, and floating-point math is costly. GPUs are designed to run lots of tasks in parallel, which is useful for an AI, and they often perform better at the types of math used by AI networks.


Tokens
The AI interprets text by converting it to tokens. Tokens are how the AI sees pieces of text. Much like morphemes, tokens are combined to form words or sentences. Because the AI has no judgment of its own, it relies on evaluating the relationships between tokens based on its training data, then determining the most likely token to come next in the sequence.

Think of it as a huge game of probabilities. There is no winner, but there are more likely answers, and the AI picks from those.

For example, a raincoat is composed of two tokens - the token for rain and the token for coat. When put together, the AI recognizes the pattern as raincoat, and evaluates its training material to figure out what tokens are associated with raincoat. Not all tokens are whole words - some tokens are as simple as punctuation marks, spaces, and even partial words like mah.

The AI is capable of evaluating patterns with up to 2048 tokens in memory. When you hit Send, the Current Context is fed to the AI as tokens, the AI estimates the most likely next token in the sequence, then repeats the process until the Generation is returned.
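That loop can be sketched as follows. `predict_next_token` is a stand-in for the real model (here it just picks randomly from a tiny fixed set), and the 2048-token window is trimmed from the front, since the oldest context falls out of memory first:

```python
import random

CONTEXT_SIZE = 2048  # maximum number of tokens the AI can "see" at once

def predict_next_token(context):
    """Stand-in for the real model: returns a plausible next token.
    Here it just picks randomly from a tiny fixed set."""
    return random.choice(["the", " cat", " sat", "."])

def generate(context_tokens, n_new_tokens):
    tokens = list(context_tokens)
    for _ in range(n_new_tokens):
        # Only the most recent CONTEXT_SIZE tokens fit in memory.
        window = tokens[-CONTEXT_SIZE:]
        tokens.append(predict_next_token(window))
    return tokens

story = generate(["Once", " upon", " a", " time"], n_new_tokens=5)
print(story)
```

The real model's `predict_next_token` step is of course vastly more sophisticated, but the append-and-repeat structure of generation is the same.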

NovelAI works by identifying links between tokens; including a token therefore gives it a higher chance of appearing. This means that mentioning something in a negative statement will still cause the AI to consider that thing as a possibility.

This is similar to ironic process theory. Instead, you should phrase things positively where possible.

Tokenizer Tool
NovelAI includes a built-in tokenizer tool that allows you to see not only the breakdown of tokens used in an input, but also the token IDs, token count, and character count.

This tool is accessed through the main menu, or by clicking on the token count of Memory, Author's Note or Lorebook entries.
