AI is ‘intimidating,’ ‘dangerous’: Members of Congress reveal how much they know about artificial intelligence

March 31, 2023

Calls to regulate artificial intelligence are growing on Capitol Hill following a dire warning from tech industry leaders. But many lawmakers also admit they don’t know much more about the technology than the average American.

“I’ve had ChatGPT demonstrated to me by a friend, and its capabilities are kind of intimidating,” Sen. Cynthia Lummis told Fox News. “They’re impressive, but the potential for mischief and misuse are high.”

Tech industry leaders including Elon Musk and Steve Wozniak signed an open letter, organized by the Future of Life Institute, calling on AI developers to pause training systems more powerful than GPT-4 for at least six months.

“Contemporary AI systems are now becoming human-competitive at general tasks,” the letter warns, and can pose profound risks to society. It asks AI labs to work together to develop shared safety protocols for advanced AI design.

If companies won’t willingly take a pause, the letter says, governments should “step in and institute a moratorium.”

Sen. Lindsey Graham said he’s “not very” familiar with the platforms but is “amazed” by what he sees.

“This is an area of life that needs to have some guidance and regulatory oversight,” the South Carolina Republican said.

Rep. Marjorie Taylor Greene said she is “very familiar” with ChatGPT following a House cybersecurity subcommittee hearing.

“Chat is very dangerous,” the Georgia Republican said. “It has a woke leaning. When we asked questions to ChatGPT, the answers were very different given the subject matter. It definitely leaned left, and I think that’s very worrisome.”

Rep. Dan Meuser, who estimated he’s as “familiar with [AI platforms] as most people,” had a sunnier outlook.

“It’s incredibly interesting,” the Pennsylvania Republican said. “It’s innovation, it’s technology, it’s advancement. We’ve got to embrace it.”
