Nvidia’s Jensen Huang says AI hallucinations are solvable, artificial general intelligence is 5 years away

Nvidia’s CEO discusses AI advancements and challenges during the GTC conference.

Nvidia CEO Jensen Huang shared his thoughts on artificial general intelligence (AGI) and AI hallucinations during the company’s annual GTC developer conference. According to Huang, AI hallucinations can be addressed through retrieval-augmented generation, while AGI could arrive within five years.

Artificial General Intelligence (AGI)

AGI, also called “strong AI,” “full AI,” “human-level AI,” or “general intelligent action,” represents a major step forward in AI development.

Unlike narrow AI, which is designed for specific tasks, AGI could perform a wide range of cognitive tasks at or above human levels.

Concerns about AGI stem from uncertainty over how such systems would make decisions and whether their goals would remain aligned with human values.

Time Frame for AGI Development

Huang argues that predicting when AGI will arrive depends on how precisely AGI is defined.

Potential benchmarks include the ability to pass legal, logic, economics, or pre-med exams.

Huang estimates AI could clear such benchmarks within five years.

AI Hallucinations

AI hallucinations occur when a model confidently generates plausible-sounding but false information without verifying it against any source.

To solve this issue, Huang proposes using “retrieval-augmented generation.”

With this approach, the model first retrieves relevant source material and grounds its answer in that material before responding, rather than generating claims from its internal parameters alone.
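
To make the pattern concrete, here is a minimal, illustrative Python sketch of retrieval-augmented generation. Everything in it is a hypothetical stand-in rather than Nvidia’s implementation: the toy corpus, the word-overlap retriever, and the generate_answer function (which returns a grounded prompt instead of calling a real language model). Production systems typically use embedding-based vector search and an actual LLM API.

```python
# Minimal RAG sketch. The corpus, the retriever, and generate_answer()
# are illustrative placeholders, not any vendor's implementation.

CORPUS = {
    "gtc": "Nvidia's GTC is an annual developer conference focused on AI and GPU computing.",
    "rag": "Retrieval-augmented generation grounds model answers in retrieved source documents.",
    "agi": "Artificial general intelligence refers to AI that matches humans across cognitive tasks.",
}

def retrieve(query: str, top_k: int = 1) -> list[str]:
    """Rank documents by naive word overlap with the query (toy retriever)."""
    q_words = set(query.lower().split())
    scored = sorted(
        CORPUS.values(),
        key=lambda doc: len(q_words & set(doc.lower().split())),
        reverse=True,
    )
    return scored[:top_k]

def generate_answer(query: str) -> str:
    """Build a prompt that constrains the answer to retrieved context."""
    context = "\n".join(retrieve(query))
    # A real system would send this grounded prompt to an LLM;
    # here we return it to show the structure of the technique.
    return (
        "Answer using ONLY the context below.\n"
        f"Context:\n{context}\n"
        f"Question: {query}"
    )

print(generate_answer("What is retrieval-augmented generation?"))
```

Because the answer is constrained to retrieved text, its claims can be traced back to a source and checked, which is the property Huang points to when he calls hallucinations solvable.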

As AI technology continues to evolve, companies like Nvidia play a central role in shaping its direction and addressing emerging issues. Huang’s remarks offer a useful perspective on where the field is headed and its potential impact on society.
