Years ago, an audacious Fields medalist outlined a sweeping program that, he claimed, could be used to resolve a major ...
The reason we can gracefully glide on an ice-skating rink or clumsily slip on an icy sidewalk is that the surface of ice is ...
Large language models such as ChatGPT come with filters to keep certain information from getting out. A new mathematical argument ...
Researchers have used metamathematical techniques to show that certain theorems that look superficially distinct are in fact ...
In mathematics, ubiquitous objects called groups display nearly magical powers. Though they’re defined by just a few rules, groups help illuminate an astonishing range of mysteries. They can tell you ...
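As a reminder of what those "few rules" are, here is the standard definition of a group in the usual notation. This formalization is general mathematical background, not something stated in the article itself:

```latex
A group is a set $G$ with a binary operation $\cdot : G \times G \to G$ satisfying:
\begin{itemize}
  \item \textbf{Associativity:} $(a \cdot b) \cdot c = a \cdot (b \cdot c)$ for all $a, b, c \in G$.
  \item \textbf{Identity:} there exists $e \in G$ with $e \cdot a = a \cdot e = a$ for all $a \in G$.
  \item \textbf{Inverses:} for each $a \in G$ there exists $a^{-1} \in G$ with $a \cdot a^{-1} = a^{-1} \cdot a = e$.
\end{itemize}
```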
In cellular automata, simple rules create elaborate structures. Now researchers can start with the structures and reverse-engineer the rules.
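To make "simple rules create elaborate structures" concrete, here is a minimal sketch of a one-dimensional elementary cellular automaton. The choice of Rule 110, a standard example known for intricate behavior, and the code itself are illustrative assumptions, not taken from the research described:

```python
# Minimal elementary cellular automaton: each cell's next state depends
# only on itself and its two neighbors, looked up in an 8-entry rule table
# encoded as the bits of the rule number (Wolfram's convention).

def step(cells: list[int], rule: int) -> list[int]:
    """Apply one update of an elementary CA (wrapping at the edges)."""
    n = len(cells)
    out = []
    for i in range(n):
        left, center, right = cells[i - 1], cells[i], cells[(i + 1) % n]
        neighborhood = (left << 2) | (center << 1) | right  # value in 0..7
        out.append((rule >> neighborhood) & 1)              # that bit of the rule
    return out

# Start from a single live cell and watch structure emerge.
cells = [0] * 31 + [1] + [0] * 31
for _ in range(30):
    print("".join("#" if c else "." for c in cells))
    cells = step(cells, 110)
```

Reverse-engineering, in the sense the teaser describes, would invert this process: given the printed pattern, recover the 8-bit rule number that produced it.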
Is language core to thought, or a separate process? For 15 years, the neuroscientist Ev Fedorenko has gathered evidence of a language network in the human brain — and has found some parallels to LLMs.
Integer linear programming can help answer a variety of real-world problems. Now researchers have found a much faster way to do it. Inspired by the results of a game-playing neural network ...
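For context, an integer linear program minimizes or maximizes a linear objective over integer-valued variables subject to linear constraints. The toy instance below is a generic illustration, not the researchers' method, and it is solved by exhaustive search so the sketch needs no assumed solver API:

```python
# A toy integer linear program, solved by brute force for illustration
# (real instances use branch-and-bound or cutting-plane solvers):
#
#   maximize    3x + 2y
#   subject to  x + y  <= 4
#               x + 3y <= 6
#               x, y >= 0, integer

best_value, best_point = float("-inf"), None
for x in range(5):          # x + y <= 4 caps x at 4
    for y in range(5):      # and likewise caps y
        if x + y <= 4 and x + 3 * y <= 6:
            value = 3 * x + 2 * y
            if value > best_value:
                best_value, best_point = value, (x, y)

print(best_point, best_value)  # (4, 0) with objective value 12
```

The integrality requirement is what makes the problem hard: dropping it leaves an ordinary linear program, which is solvable in polynomial time.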
Naomi Saphra thinks that most research into language models focuses too much on the finished product. She’s mining the history of their training for insights into why these systems work the way they ...