
Android: celebrating a big milestone together with you

When I started working at Google in early 2007, it was before Android, before iOS. Mobile was still niche. And while many of us had a sense that mobile was going to be big, I’m not sure we really realized just how big it was going to get. Fast forward to today, and there are now 2 billion monthly active Android devices globally. This is an extraordinarily humbling milestone—and it’s the largest reach of any computing platform of its kind. Today at Google I/O, we celebrated that milestone and showcased a number of ways we’re working to make …Read More

Source:: Android: celebrating a big milestone together with you

The TensorFlow Research Cloud offers researchers the following benefits:

  • Access to Google’s all-new Cloud TPUs that accelerate both training and inference
  • Up to 180 teraflops of floating-point performance per Cloud TPU
  • 64 GB of ultra-high-bandwidth memory per Cloud TPU
  • Familiar TensorFlow programming interfaces (a rough sketch of what this can look like follows this list)
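As an illustration of what those familiar interfaces can look like, here is a minimal sketch of TPU training using TensorFlow 2's tf.distribute.TPUStrategy. The TPU name, model, and dataset are placeholders, and the exact interfaces offered through the Research Cloud (which predates TensorFlow 2) may differ.

```python
import tensorflow as tf

# Hypothetical TPU name; on Cloud TPU this would be the name or gRPC address
# of the TPU worker assigned to you.
resolver = tf.distribute.cluster_resolver.TPUClusterResolver(tpu="my-tpu")
tf.config.experimental_connect_to_cluster(resolver)
tf.tpu.experimental.initialize_tpu_system(resolver)
strategy = tf.distribute.TPUStrategy(resolver)

# The model is written exactly as it would be for CPU/GPU training;
# only the strategy scope changes.
with strategy.scope():
    model = tf.keras.Sequential([
        tf.keras.layers.Dense(128, activation="relu", input_shape=(784,)),
        tf.keras.layers.Dense(10),
    ])
    model.compile(
        optimizer="adam",
        loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
        metrics=["accuracy"],
    )

# train_dataset is a placeholder tf.data pipeline with a fixed batch size:
# model.fit(train_dataset, epochs=5)
```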

You can sign up here to request to be notified when the TensorFlow Research Cloud application process opens, and you can optionally share more information about your computational needs. We plan …Read More

Source:: Introducing the TensorFlow Research Cloud

News Feed FYI: New Updates to Reduce Clickbait Headlines

By Arun Babu, Engineer, Annie Liu, Engineer, and Jordan Zhang, Engineer

People tell us they don’t like stories that are misleading, sensational or spammy. That includes clickbait headlines that are designed to get attention and lure visitors into clicking on a link. In an effort to support an informed community, we’re always working to determine what stories might have clickbait headlines so we can show them less often.

Last year we made an update to News Feed to reduce stories from sources that consistently post clickbait headlines that withhold and exaggerate information. Today, we are making three updates that build …Read More
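Facebook has not published the details of its clickbait classifiers or how they feed into ranking, but the idea of scoring headlines for the two signals mentioned above (withholding and exaggerating information) and demoting matching stories can be sketched roughly as follows; the phrase lists, weights, and helper names here are made up for illustration.

```python
from dataclasses import dataclass

# Toy phrase lists standing in for learned clickbait signals.
WITHHOLDING_PHRASES = ["you won't believe", "what happened next", "this one trick"]
EXAGGERATION_PHRASES = ["destroys", "mind-blowing", "will change your life"]

@dataclass
class Story:
    headline: str
    base_score: float  # relevance score from the rest of the ranking system

def clickbait_score(headline: str) -> float:
    """Return a 0..1 score; higher means more clickbait-like."""
    text = headline.lower()
    hits = sum(p in text for p in WITHHOLDING_PHRASES + EXAGGERATION_PHRASES)
    return min(1.0, hits / 2.0)

def ranked(stories: list[Story]) -> list[Story]:
    # Demote (rather than remove) stories whose headlines look like clickbait.
    return sorted(
        stories,
        key=lambda s: s.base_score * (1.0 - 0.5 * clickbait_score(s.headline)),
        reverse=True,
    )
```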

Source:: News Feed FYI: New Updates to Reduce Clickbait Headlines

Using Machine Learning to Explore Neural Network Architecture

At Google, we have successfully applied deep learning models to many applications. One example is our GoogleNet architecture, whose design required many years of careful experimentation and refinement from initial versions of convolutional architectures.

To make this process of designing machine learning models much more accessible, we’ve been exploring ways to automate the design of machine learning models. Among the many algorithms we’ve studied, evolutionary algorithms [1] and reinforcement learning algorithms [2] have shown great promise. But in this blog post, …Read More
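As a toy illustration of the evolutionary flavor of automated model design (not the method described in the post), the sketch below searches a two-parameter architecture space with a simple mutate-and-select loop; the search space and the stand-in evaluate() function are invented for the example.

```python
import random

# Each "architecture" is just (num_layers, units); in a real system evaluate()
# would train the candidate network and return its validation accuracy.
SEARCH_SPACE = {"num_layers": range(1, 6), "units": (32, 64, 128, 256)}

def random_architecture():
    return {
        "num_layers": random.choice(list(SEARCH_SPACE["num_layers"])),
        "units": random.choice(SEARCH_SPACE["units"]),
    }

def mutate(arch):
    # Copy the parent and re-sample one of its parameters.
    child = dict(arch)
    key = random.choice(list(child))
    child[key] = random.choice(list(SEARCH_SPACE[key]))
    return child

def evaluate(arch):
    # Placeholder fitness function standing in for training + validation.
    return 1.0 - abs(arch["num_layers"] - 3) * 0.1 - abs(arch["units"] - 128) / 1024

population = [random_architecture() for _ in range(10)]
for generation in range(20):
    scored = sorted(population, key=evaluate, reverse=True)
    parents = scored[:5]                                   # keep the fittest
    children = [mutate(random.choice(parents)) for _ in range(5)]
    population = parents + children

print("best architecture found:", max(population, key=evaluate))
```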

Source:: Using Machine Learning to Explore Neural Network Architecture

Each module processes inputs and produces transformed representations of those inputs as its outputs (which are, in turn, available to the next level). In the Smart Reply system, the repeated structure has two layers of hierarchy. The first makes each feature useful as a predictor of the final result, and the second combines these features. By definition, the second works at a more abstract representation and considers a wider timescale.
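To make the two-level structure concrete, here is a minimal Keras sketch (not the actual Smart Reply architecture): one small module per input feature produces that feature's representation (the first level), and a second module combines them into a more abstract representation that drives the prediction (the second level). The feature names and layer sizes are assumptions.

```python
import tensorflow as tf

# Level 1: one small module per feature, each producing its own representation.
subject_in = tf.keras.Input(shape=(64,), name="subject_embedding")
body_in = tf.keras.Input(shape=(256,), name="body_embedding")

subject_repr = tf.keras.layers.Dense(32, activation="relu")(subject_in)
body_repr = tf.keras.layers.Dense(32, activation="relu")(body_in)

# Level 2: combine the per-feature representations into a more abstract one
# that drives the final prediction.
combined = tf.keras.layers.Concatenate()([subject_repr, body_repr])
abstract = tf.keras.layers.Dense(64, activation="relu")(combined)
output = tf.keras.layers.Dense(1, activation="sigmoid", name="reply_score")(abstract)

model = tf.keras.Model(inputs=[subject_in, body_in], outputs=output)
model.summary()
```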

By …Read More

Source:: Efficient Smart Reply, now for Gmail