The DALL-E 2 neural network developed by OpenAI appears to have invented its own language. The network generates images from text prompts; testing began in April 2022.

Users have discovered that the model can generate images containing text captions. At first glance, these look like meaningless strings of letters, but that turns out not to be the case: if you feed the same letters back in as a prompt, the neural network consistently generates images from a particular category.
For example, the prompt “Apoploe vesrreaitais” returns images of birds, the phrase “Contarra ccetnxniams luryca tanniounons” returns insects or pests, and the longer phrase “Apoploe vesrreaitais eating Contarra ccetnxniams luryca tanniounons” makes the neural network produce images of birds eating insects.
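For readers who want to replay such prompts themselves, below is a minimal sketch using OpenAI's public Images API via the official openai Python package. The model name, image size, and sample count are illustrative assumptions rather than details from the article, and access to DALL-E 2 was invite-only at the time of writing.

# Minimal sketch, assuming access to OpenAI's Images API via the
# official `openai` Python package (v1+) with OPENAI_API_KEY set.
from openai import OpenAI

client = OpenAI()  # reads the API key from the environment

# Gibberish phrases reported to map to consistent image categories.
prompts = [
    "Apoploe vesrreaitais",  # reportedly yields birds
    "Contarra ccetnxniams luryca tanniounons",  # reportedly yields insects
    "Apoploe vesrreaitais eating Contarra ccetnxniams luryca tanniounons",
]

for prompt in prompts:
    result = client.images.generate(
        model="dall-e-2",  # illustrative choice of image model
        prompt=prompt,
        n=4,               # several samples to check category consistency
        size="512x512",
    )
    for i, image in enumerate(result.data):
        print(f"{prompt!r} sample {i}: {image.url}")

Generating several samples per phrase is the point of the experiment: a consistent category across samples is what suggests the gibberish carries meaning for the model.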

If you submit the prompt “Two whales talk about food”, you get an image with the caption “Wa ch zod rea”. It seems that this really is a ‘language’ of sorts that allows the neural network to process data faster.
Source: VG Times