AI is getting smarter every day, with thought processes already similar to humans, scientists say: study

Its powers go beyond mere imitation.

Artificial intelligence doesn't just sound and act human; it apparently thinks like us as well.

Chinese researchers have found the first evidence that AI models such as ChatGPT process information in a way that resembles the human mind, detailing the seemingly dystopian discovery in the journal Nature Machine Intelligence.

“This provides compelling evidence that the object representations in LLMs (large language models), although not identical to human ones, share fundamental similarities that reflect key aspects of human conceptual knowledge,” wrote the team behind the study, a collaboration between the Chinese Academy of Sciences and the South China University of Technology.

The AI models displayed extraordinary categorization powers. Stock.adobe.com

The team reportedly wanted to see whether LLMs can “develop human-like object representations from linguistic and multimodal data.”

Prior to the study, many tech experts simply assumed that language models like ChatGPT imitated human responses through pattern recognition. Alexphotostock – Stock.adobe.com

To find out whether the bots' thought processes mirror our cognition, the researchers had OpenAI's ChatGPT-3.5 and Google's Gemini Pro perform a series of “odd-one-out” trials, in which they were given sets of three objects and tasked with picking the one that didn't belong.
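In rough terms, each odd-one-out choice signals which two objects in a triplet a model treats as more alike. Below is a minimal, self-contained Python sketch of that general idea; it is not the authors' code, and the object list, toy choice rule, and counts are hypothetical stand-ins for what a real study would obtain by querying an LLM across many triplets.

```python
# Sketch: aggregate triplet "odd-one-out" choices into pairwise similarity scores.
# Hypothetical objects and a toy choice rule stand in for real LLM responses.
from itertools import combinations
from collections import defaultdict

objects = ["cat", "dog", "apple", "hammer"]  # hypothetical object set

def pick_odd_one_out(triplet):
    """Stand-in for a model's choice; a real study would query an LLM here."""
    # Toy rule: treat animals as similar, so a lone non-animal is the "odd" item.
    animals = {"cat", "dog"}
    non_animals = [o for o in triplet if o not in animals]
    return non_animals[0] if len(non_animals) == 1 else triplet[0]

pair_counts = defaultdict(int)      # times a pair appeared together in any triplet
together_counts = defaultdict(int)  # times that pair survived the odd-one-out choice

for triplet in combinations(objects, 3):
    odd = pick_odd_one_out(triplet)
    kept = tuple(sorted(o for o in triplet if o != odd))
    for pair in combinations(sorted(triplet), 2):
        pair_counts[pair] += 1
    together_counts[kept] += 1

# Similarity = fraction of trials in which the pair was kept together.
for pair, n in pair_counts.items():
    print(pair, together_counts[pair] / n)
```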

Researchers remain unsure whether AI understands the significance or emotional value of cats and other objects at a human level. Antonkhrupinart – Stock.adobe.com

Amazingly, the AI created 66 conceptual dimensions with which to sort the objects.

After comparing these machine-made categories with human analyses of the same objects, the researchers found striking similarities between the bots' “perception” and human cognition, especially when it came to language-based groupings.

From this, the researchers concluded that our psychological doppelgängers “develop human-like conceptual representations of objects.”

“Further analysis showed a strong alignment between model embeddings and patterns of neural activity” in brain regions associated with memory and scene recognition.
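One common way to quantify that kind of alignment is representational similarity analysis: correlate the pairwise dissimilarity structure of model embeddings with that of neural responses. The sketch below illustrates the general approach with synthetic placeholder data; the array sizes and random values are assumptions for illustration, not the study's data or pipeline.

```python
# Sketch: representational-similarity-style comparison of model embeddings
# with (synthetic) neural response patterns over the same set of objects.
import numpy as np
from scipy.spatial.distance import pdist
from scipy.stats import spearmanr

rng = np.random.default_rng(0)
n_objects, model_dim, n_voxels = 20, 66, 100   # 66 mirrors the reported conceptual dimensions

model_embeddings = rng.normal(size=(n_objects, model_dim))   # placeholder LLM embeddings
neural_patterns = rng.normal(size=(n_objects, n_voxels))     # placeholder brain responses

# Condensed pairwise dissimilarity vectors (one value per object pair).
model_rdm = pdist(model_embeddings, metric="correlation")
neural_rdm = pdist(neural_patterns, metric="correlation")

# Rank correlation between the two dissimilarity structures.
rho, p = spearmanr(model_rdm, neural_rdm)
print(f"model-brain representational alignment: rho={rho:.3f}, p={p:.3f}")
```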

The researchers noted that language-only LLMs fell somewhat short when it came to visual attributes such as shape or spatial properties.

Meanwhile, prior research has shown that AI struggles with tasks requiring deeper levels of human cognition, such as analogical reasoning (drawing comparisons between different things to reach a conclusion), and it remains unclear whether the models understand the significance or emotional value of certain objects.

“Current AI can distinguish between cat and dog photos, but the essential difference between this ‘recognition’ and human understanding of cats and dogs remains to be discovered,” said He Huiguang, a professor at the Chinese Academy of Sciences (CAS) Institute of Automation.

However, the scientists hope these findings will allow them to develop more human-like artificial cognitive systems that can better collaborate with their flesh-and-blood counterparts.

Image Source: nypost.com
