Subscribe to Synced Global AI Weekly

Hot Chips | A California Startup Has Built an AI Chip as Big as a Notebook. Why?
California AI startup Cerebras Systems has introduced the Cerebras Wafer Scale Engine (WSE), the largest chip ever built for neural network processing. The 16nm WSE is a 46,225 mm² silicon chip featuring 1.2 trillion transistors, 400,000 AI-optimized cores, 18 gigabytes of on-chip memory, 9 petabyte/s of memory bandwidth, and 100 petabit/s of fabric bandwidth.

OpenAI Releases 774 Million Parameter GPT-2 Language Model
OpenAI has released the 774 million parameter GPT-2 language model, following the release of its small 124M model in February, the staged release of a medium 355M model in May, and subsequent research with partners and the AI community into the model's potential for misuse and societal benefit.
(OpenAI)
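
The "774M" in the model's name refers to its parameter count, which follows directly from the architecture shapes published in the GPT-2 paper (36 transformer layers, model width 1280, a 50,257-token vocabulary, and a 1024-position context). As a rough sanity check, those shapes can be tallied up like this — the exact bookkeeping of biases and LayerNorms below is an approximation, not OpenAI's official accounting:

```python
# Back-of-envelope parameter count for GPT-2 "large" (774M).
# Shapes are from the GPT-2 paper; the bias/LayerNorm bookkeeping
# here is an approximation for illustration.
def gpt2_param_count(n_layers=36, d_model=1280, vocab=50257, n_ctx=1024):
    embeddings = vocab * d_model + n_ctx * d_model   # token + position tables
    attention = 4 * d_model * d_model + 4 * d_model  # Q, K, V, output projections + biases
    mlp = 8 * d_model * d_model + 5 * d_model        # two 4x-wide linear layers + biases
    layer_norms = 2 * 2 * d_model                    # two LayerNorms per block (scale + shift)
    final_norm = 2 * d_model
    return embeddings + n_layers * (attention + mlp + layer_norms) + final_norm

print(gpt2_param_count())  # roughly 774 million, the model's namesake
```
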

Waymo Open Dataset: Sharing Self-Driving Data for Research
Waymo is sharing the Waymo Open Dataset, a high-quality multimodal sensor dataset for autonomous driving. Waymo researchers believe it is one of the largest, richest, and most diverse self-driving datasets ever released.


Turbo, An Improved Rainbow Colourmap for Visualization
A research team from Google AI has introduced Turbo, a new colourmap that has the desirable properties of Jet while also addressing some of its shortcomings, such as false detail, banding, and colour-blindness ambiguity. Turbo was hand-crafted and fine-tuned to be effective for a variety of visualization tasks.
(Google AI)
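
Like Jet, Turbo is distributed as a lookup table of RGB values that scalar data is interpolated into. A minimal sketch of how such a table is applied is below — the three anchor colors are illustrative placeholders, not Turbo's actual published 256-entry data:

```python
import numpy as np

# Sketch of lookup-table colormapping, as used by Turbo and Jet.
# The LUT entries below are placeholders, NOT the real Turbo values,
# which Google publishes as a 256-entry RGB table.
LUT = np.array([
    [0.19, 0.07, 0.23],   # dark low end (placeholder)
    [0.65, 0.85, 0.22],   # bright midpoint (placeholder)
    [0.48, 0.01, 0.01],   # dark high end (placeholder)
])

def apply_colormap(values, lut):
    """Map scalars in [0, 1] to RGB by linear interpolation into the LUT."""
    v = np.clip(values, 0.0, 1.0) * (len(lut) - 1)
    lo = np.floor(v).astype(int)
    hi = np.minimum(lo + 1, len(lut) - 1)
    frac = (v - lo)[..., None]
    return (1 - frac) * lut[lo] + frac * lut[hi]

depth = np.linspace(0, 1, 5)   # e.g. a normalized depth map
rgb = apply_colormap(depth, LUT)
```

Interpolating between hand-tuned anchors is what lets Turbo stay smooth (avoiding Jet's banding) while remaining a drop-in replacement for Jet-style pipelines.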

On The Variance of The Adaptive Learning Rate and Beyond
Researchers propose RAdam, a new variant of Adam, which introduces a term to rectify the variance of the adaptive learning rate. Extensive experimental results on image classification, language modeling, and neural machine translation demonstrate the effectiveness and robustness of the proposed method.
(UIUC & Georgia Tech & Microsoft)
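The key observation is that Adam's adaptive learning rate has excessively large variance in the first few steps, when the second-moment estimate is based on too few samples. RAdam's rectification term, following the formulas in the paper, can be sketched as:

```python
import math

# Sketch of RAdam's variance rectification (formulas from the paper).
# beta2 is Adam's usual second-moment decay hyperparameter.
def rectification(t, beta2=0.999):
    """Return (use_adaptive, r_t) for step t >= 1."""
    rho_inf = 2.0 / (1.0 - beta2) - 1.0                      # max length of the approximated SMA
    rho_t = rho_inf - 2.0 * t * beta2**t / (1.0 - beta2**t)  # SMA length at step t
    if rho_t > 4.0:
        # Variance is tractable: scale the adaptive step by r_t.
        r_t = math.sqrt(
            (rho_t - 4.0) * (rho_t - 2.0) * rho_inf
            / ((rho_inf - 4.0) * (rho_inf - 2.0) * rho_t)
        )
        return True, r_t
    # Early steps: fall back to an un-adapted, momentum-only update.
    return False, 1.0
```

For the first few steps rho_t stays below 4 and the optimizer behaves like SGD with momentum; as t grows, r_t rises toward 1 and the update converges to standard Adam, which is why RAdam needs no warmup schedule.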

A Critique of Pure Learning and What Artificial Neural Networks Can Learn from Animal Brains
Researchers argue that most animal behavior is not the result of clever learning algorithms — supervised or unsupervised — but rather is encoded in the genome. Specifically, animals are born with highly structured brain connectivity, which enables them to learn very rapidly.

You May Also Like

MIT & Adobe Introduce Real Time AR Tool for Storytelling
Researchers from MIT Media Lab and Adobe Research have introduced a real-time interactive augmented video system that enables presenters to use their bodies as storytelling tools by linking gestures to illustrative virtual graphic elements.

Huawei’s First Commercial AI Chip Doubles the Training Performance of Nvidia’s Flagship GPU
Billed as the world's most computationally dense single chip, the Ascend 910 delivers up to 256 teraFLOPS at FP16 and 512 teraOPS at INT8, with a declared maximum power consumption of 310W.
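
The "doubles" claim in the headline lines up with a quick comparison against Nvidia's flagship training GPU at the time. The V100 figure below is taken from Nvidia's published spec sheet (~125 TFLOPS peak FP16 tensor performance), not from the article itself:

```python
# Back-of-envelope check of the headline claim.
# 125 TFLOPS is Nvidia's published peak FP16 tensor figure for the
# V100, used here as an assumed point of comparison.
ascend_910_fp16_tflops = 256
v100_fp16_tensor_tflops = 125

ratio = ascend_910_fp16_tflops / v100_fp16_tensor_tflops
print(f"Ascend 910 / V100 peak FP16: {ratio:.2f}x")
```

Peak-throughput ratios are an upper bound, of course; realized training performance depends on memory bandwidth, interconnect, and software stack.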

Global AI Events

September 10–12: The AI Summit (Part of TechXLR8) in Singapore

September 24–28: Microsoft Ignite in Orlando, United States

October 23–25: NVIDIA's GPU Technology Conference (GTC) in Munich, Germany

October 27–November 3: International Conference on Computer Vision (ICCV) in Seoul, South Korea

Global AI Opportunities

DeepMind Internship Program

NVIDIA Graduate Fellowships

DeepMind Scholarship: Access to Science

Landing AI is Recruiting

Stanford HAI is Recruiting

OpenAI Seeking Software Engineers and Deep Learning Researchers

DeepMind is Recruiting

Stay tight with AI!
Subscribe to Synced Global AI Weekly