Teaching Computer Architectures for Cognitive Processing

Time: Tuesday, October 24, 2017 - 10:30am - 11:30am
Type: Seminar Series
Presenter: Richard Lethin, President at Reservoir Labs
Room/Office: Becton 035
Location:
Becton Seminar Room
15 Prospect Street
New Haven, CT 06511
United States

Department of Electrical Engineering Seminar

"Teaching Computer Architectures for Cognitive Processing"

Richard Lethin
President at Reservoir Labs

For decades, the idea of building computer architectures for Artificial Intelligence (AI) was controversial, and, yes, disreputable, in both the computer architecture and AI communities. The field compiled a consistent record of failure in its AI ambitions, along with an often-overlooked record of success in pioneering features that benefited mainstream computing. In the past two years, architecture for AI has become one of the most exciting areas, with the Machine Learning (ML) features of processors enabling leaps in capability: smart machines, creativity, scientific understanding, and profitability. Hundreds of millions of dollars of venture, industrial, and government funding are now being invested in building computer architectures for AI. The rapid pace of improvement and the breadth and scale of activity around ML and AI processors make it essential to bring these new chips into computer architecture education and to address the basic challenge of keeping pace with innovation. One route to teaching about these chips is to place them in the context of historical attempts, which offer design idioms and lessons from failure and illuminate paths forward. This talk will give an overview of current machine learning chips in the context of historical AI machines and illustrate some promising paths forward for the field.

Richard Lethin is an Associate Professor (Adjunct) in Electrical Engineering at Yale and President at Reservoir Labs. At Reservoir Labs, his team works on algorithms and compiler software for advanced computer architectures. Richard received his Ph.D. from MIT, where his group was part of the AI Lab and built the J-Machine, a massively parallel computer for AI funded by the DARPA Strategic Computing Initiative amid concerns about the Japanese Fifth Generation computer project. Richard received his B.S. in Electrical Engineering from Yale and joined Multiflow Computer, a Yale computer architecture spinout that developed the first Very Long Instruction Word (VLIW) computer.


Hosted by Professor Rajit Manohar