AI trends according to Chinese intelligence

Recently posted classified documents reveal that AGI is close, that human-generated data for AI training is insufficient, that quantum computers will be used for AI, and that software coding will rely less on engineers.

In this post we share an exclusive document that aims to explain the upcoming trends in Artificial Intelligence according to the intelligence agencies of the world's second-largest economy, China. The document has been published only on Reddit, via an anonymous source.


Foreword:

A joint research group formed by China's Ministry of State Security and China's Institute of Science, Technology and Cybersecurity published an internal report on December 29, 2023 (Beijing time) assessing AI development trends in 2024. Considering the audience of the report - high-level Chinese officials who do not necessarily have the relevant technical background - its producers used plain language. My understanding is that the report has been recognized by high-level Chinese officials. The original version of the report is in Chinese; it was translated by me, Liu Jidong.

Trend I: Moving from AI Big Models to General Artificial Intelligence

OpenAI is training the next generation of AI, Q*, which is characterized by intelligence that does not come from human activity data and by its ability to modify its own code to adapt to more complex learning tasks. It can iterate on itself, and since AI iterates far faster in virtual environments than can be imagined, it will eventually be able to autonomously develop AIs that outperform humans in various fields and can be used to solve complex scientific problems. For example, controlling artificial fusion, screening nano- or superconducting materials, and developing anti-cancer drugs usually take human researchers decades to yield new solutions, and in some cutting-edge areas the volume of research has already exceeded human limits. A general artificial intelligence, with unlimited time and energy in its own virtual world, could easily take over such virtualizable tasks and become a replacement for human researchers.

However, the research group suggests that the optimistic statements of Silicon Valley giants should not be overrated: there have been three AI winters in the history of artificial intelligence, full of examples of grand technological visions that fizzled out due to limitations of various kinds.

Trend II: Synthetic data to break the AI training data bottleneck

The data bottleneck refers to the limited availability of high-quality data that can be used to train AI. Synthetic data imitates real data and is synthesized by machine learning models using mathematical and statistical principles; it is expected to break this bottleneck.

Studies have shown that models have to scale to at least 62 billion parameters before they can be trained to chain-of-thought capability, that is, to perform step-by-step logical reasoning. The reality, however, is that there is not that much non-duplicated, human-generated, training-ready high-quality data available to date. The need for high-quality data can instead be met with high-quality synthetic data produced by generative AI.
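The basic idea can be illustrated with a minimal sketch: fit a simple statistical model to a handful of real measurements, then sample as many synthetic records as needed from that model. (This uses a toy Gaussian model and made-up numbers purely for illustration; real synthetic-data pipelines use far richer generative models.)

```python
import random
import statistics

def fit_and_sample(real_data, n_synthetic, seed=0):
    """Fit a simple Gaussian model to real measurements, then sample
    synthetic records that imitate their statistical properties."""
    mu = statistics.mean(real_data)
    sigma = statistics.stdev(real_data)
    rng = random.Random(seed)  # fixed seed for reproducibility
    return [rng.gauss(mu, sigma) for _ in range(n_synthetic)]

# A tiny illustrative "real" dataset (hypothetical sensor readings).
real = [4.9, 5.1, 5.0, 4.8, 5.2, 5.0, 4.95, 5.05]

# The synthetic set mimics the real distribution without copying any record.
synthetic = fit_and_sample(real, n_synthetic=1000)
print(round(statistics.mean(synthetic), 1))  # close to the real mean of 5.0
```

Because no synthetic record is a copy of a real one, a set generated this way sidesteps the personal-information and copyright concerns discussed below.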

Data security considerations are another important reason why synthetic data is gaining traction. In recent years, countries have introduced stricter data protection laws, making it more cumbersome to train AI on human-generated data: such data may contain personal information, much of it is protected by copyright, and attempts at de-identification face challenges in screening and recognition accuracy. For now, using synthetic data is the most affordable option.

Another advantage of synthetic data is that training on human data can lead the AI to learn harmful things - misusing everyday objects, making bombs, handling controlled chemicals - as well as bad habits the AI should not have, such as human-like laziness in task execution, lying to please users, and bias and discrimination. If synthetic data is used instead, so that the AI is exposed to as little harmful content as possible during training, these drawbacks of training on human data can be expected to be overcome.

The research group suggests that ensuring companies and organizations responsibly produce synthetic training sets that are consistent with Chinese culture and values, yet comparable in scale and technology to Western English-language online sources, will be a challenging issue for China. Another major change brought about by synthetic data concerns governance: human data will continue to be generated, stored, and used in accordance with the laws and order of human society - maintaining national data security, keeping commercial data confidential, and respecting personal privacy - while the synthetic data required for AI training should be managed under a different set of standards.

Trend III: Quantum computers will be applied to artificial intelligence

Artificial intelligence has always faced the worry of insufficient computing power; against this background, applying quantum computers to artificial intelligence has come to be discussed as a promising future solution.

First, most algorithms in the field of artificial intelligence belong to the category of parallel computing, and quantum computers are good at parallel computing: a qubit can hold the two states of zero and one at the same time, without consuming the additional computational resources that electronic computers need, such as connecting multiple compute units or splitting a task across time slots. The more complex the task, the greater the advantage of quantum computing.
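A toy classical simulation illustrates why the state space matters (this is ordinary Python, not quantum hardware; it merely tracks the amplitudes a real quantum register would hold): an n-qubit register in uniform superposition carries 2^n amplitudes at once, so a single operation on the register acts on all 2^n basis states simultaneously.

```python
import math

def uniform_superposition(n_qubits):
    """Simulate applying a Hadamard gate to each of n qubits starting from
    |00...0>: the result gives equal amplitude to all 2**n basis states."""
    dim = 2 ** n_qubits
    amp = 1 / math.sqrt(dim)
    return [amp] * dim

state = uniform_superposition(10)
print(len(state))                            # 1024 amplitudes tracked at once
print(round(sum(a * a for a in state), 6))   # probabilities sum to 1.0
```

Note that the classical simulation must explicitly store all 2^n numbers, which is exactly the cost a real quantum register avoids.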

Second, the hardware required to run artificial intelligence is also well suited to the introduction of large quantum computers: both need to be installed in highly integrated computing centers and supported by specialized technical teams.

However, the research group believes that quantum computers will not completely replace electronic computers in the future. It is more likely that quantum and electronic computers will play to their respective strengths in different application scenarios and develop in synergy, enhancing computing power while taking cost and feasibility into account.

Trend IV: AI agents and no-code software development will bring a huge impact

As of now, at least nearly 200 million people around the world use large AI models, but people are no longer satisfied with sitting in front of a computer chatting with the AI. They have begun to develop tools that automatically send prompts to the AI according to the needs of a task; when such automatic prompting tools are combined with large models, the AI agent is born.
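A minimal sketch makes the "automatic prompting tool + large model" combination concrete. Here `toy_model` is a hypothetical stand-in for a real large-model API call; the agent asks it for a plan and then feeds each step back as a new prompt, with no human in between:

```python
def toy_model(prompt):
    """Hypothetical stand-in for a large language model API call."""
    if prompt.startswith("Plan:"):
        return "load data; compute totals; write summary"
    if prompt.startswith("Execute:"):
        return "OK: " + prompt[len("Execute: "):]
    return "done"

def run_agent(task):
    """Minimal agent loop: request a plan, then auto-prompt each step."""
    plan = toy_model(f"Plan: {task}")
    return [toy_model(f"Execute: {step.strip()}") for step in plan.split(";")]

print(run_agent("summarize quarterly sales"))
# ['OK: load data', 'OK: compute totals', 'OK: write summary']
```

Real agents add tool use, memory, and error handling, but the core loop - model output becoming the next prompt automatically - is the same.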

AI agents can automatically create websites from relatively vague prompts, automatically complete a variety of text- and table-processing work normally done with office software, and even automatically summarize existing papers and generate analysis reports. It should be noted that AI agents will have an impact on many existing jobs, as companies may try to hire fewer people to perform the same tasks; this disruption of existing economic structures brought about by innovation is called creative destruction. As AI agents replace a large number of tasks that require few computer skills, the displaced workforce will be forced to adapt to new labor market demands, which is destined to be a long and painful process.

Although generative AI may eliminate a number of traditional digital jobs, it also opens a window while closing a door: no-code software development. Current AI-based programming aids, built on large models, have reached a new stage of development and can generate software or web-page code from very vague user instructions. This greatly lowers the threshold for developing IT services, can meet many people's needs for innovation, and will become a new outlet for Internet innovation.
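As a toy illustration of the idea (not a real AI coding assistant, which would call a code-generation model), a hypothetical `generate_page` helper turns a vague natural-language instruction into a minimal working web page:

```python
def generate_page(instruction):
    """Toy stand-in for an AI coding assistant: map a vague instruction
    to minimal valid HTML. A real tool would call a code model here."""
    topic = instruction.lower().replace("make me a page about", "").strip()
    title = (topic or "untitled").title()
    return (
        "<!DOCTYPE html>\n"
        f"<html><head><title>{title}</title></head>\n"
        f"<body><h1>{title}</h1><p>Generated from: {instruction}</p></body></html>"
    )

html = generate_page("Make me a page about quantum computing")
print("<h1>Quantum Computing</h1>" in html)  # True
```

The point is the interface, not the template: the user states intent in plain language and receives runnable code, with no programming knowledge required.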

The research group suggests that the Chinese government should change its mindset to balance market regulation with the promotion of innovation, lower the thresholds for registration and financing in digital innovation, and explore new copyright and patent policies that are more conducive to protecting innovation.


Thank you for reading!

If you would like to know how you can safely earn 4-5% monthly on your BTC, ETH, or USDT/C, you might also read our review of Crypto4Winners!

Connect with us!

If you like our work in BitYields, consider getting in touch with us at: