Currents 033: Connor Leahy on Deep Learning

Connor Leahy has a wide-ranging chat with Jim about the state & future of Deep Learning. They cover the history of EleutherAI, how GPT-3 works, the dynamics & power of scaling laws, ideal sampling rates & sizes for models, data sets, EleutherAI's open-source GPT-Neo & GPT-NeoX, PyTorch vs TensorFlow, TPUs vs GPUs, the challenge of benchmarking & evaluations, quadratic bottlenecks, broad GPT-3 applications, Connor's thoughts on Jim's proposed GPT-3 research project, untapped GPT-3 potential, OpenAI's move away from open source, alignment, AI safety, the unknown future, and much more.

Connor Leahy is an AI Researcher at German startup Aleph Alpha and a founding member of the loose AI research collective EleutherAI. EleutherAI is best known for its ongoing effort to produce a full open-source, GPT-3-sized language model. Connor currently researches large, general-purpose language models and how to align them to human values.