EE board - Nvidia, GPU and machine learning (WSJ article)
r***0
Posts: 406
1
http://www.wsj.com/articles/new-chips-propel-machine-learning-1
Computer users have long relied on Nvidia Corp.’s technology to paint
virtual worlds on the screen as they gunned down videogame enemies. Now some
researchers are betting it can also help save lives—of real people.
Massachusetts General Hospital recently established a center in Boston that
plans to use Nvidia chips to help an artificial-intelligence system spot
anomalies on CT scans and other medical images, jobs now carried out by
human radiologists. The project, drawing on a database of 10 billion
existing images, is designed to “train” systems to help doctors detect
cancer, Alzheimer’s and other diseases earlier and more accurately.
“Computers don’t get tired,” said Keith Dreyer, the center’s executive
director and vice chairman of radiology at Mass General. “There is no doubt
that this will change the way we practice health care, and it will clearly
change it for the better.”
The effort is one of many examples illustrating how advances in microchips—
particularly the graphics-processing units pioneered by Nvidia—are fueling
explosive growth in machine learning, a programming approach in which
computers teach themselves without explicit instructions and then make
decisions based on what they’ve learned.
Internet giants such as Google Inc., Facebook Inc., Microsoft Corp., Twitter
Inc. and Baidu Inc. are among the most active, using the chips called GPUs
to let servers study vast quantities of photos, videos, audio files and
posts on social media to improve functions such as search or automated photo
tagging. Some auto makers are exploiting the technology to develop self-
driving cars that sense their surroundings and avoid hazards.
Some companies are betting that GPUs will be overtaken for such purposes by
more specialized chips. Google, in a surprise move, last Wednesday disclosed
that, in addition to Nvidia’s GPUs, it has been using an internally
developed processor for machine learning. Others advocating special-purpose
processors include Movidius, a Silicon Valley startup selling chips it calls
vision processing units, and Nervana Systems, a machine learning service
that plans to move from GPUs to chips of its own design.
“There is no way that existing [chip] architectures will be right in the
long term,” said Jeff Hawkins, co-founder of Numenta, a company started 11
years ago to work on brain-like forms of computing.
For now, Nvidia has a substantial lead in the field, one of several factors
that have doubled the company’s share price in 12 months and pushed its
market value above $24 billion. The company, which continues to benefit from
strong growth in videogames, reported this month that its business selling
GPUs for data centers rose 62% from a year earlier.
CEO Jen-Hsun Huang, a Taiwan-born executive known for a trademark leather
jacket and a fondness for Tesla electric cars, has emerged as a kind of Pied
Piper for the machine-learning technique known as deep learning. He
attributes Nvidia’s data center growth to the big cloud-computing vendors
moving deep learning from testing into their core services.
“It is now clear that hyperscale companies all around the world are moving
into production,” he said.
The research firm Tractica LLC estimates spending on GPUs as a result of
deep learning projects will grow from $43.6 million in 2015 to $4.1 billion
by 2024, and related software spending by enterprises will increase from $109
million to $10.4 billion over the same period.
GPUs, also produced by Nvidia rival Advanced Micro Devices Inc., are
especially suited for this work because they can perform many calculations
simultaneously. Where conventional processors are designed to execute
sequences of varied types of instructions, GPUs excel at performing a single
type of calculation many times at once—like applying a color to each pixel
on a computer display to generate an image. To accomplish this, Nvidia’s
latest GPU has 3,584 relatively simple processor cores working in parallel,
compared with one to 22 more complex calculating engines on general-purpose
processors from Intel Corp.
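To make that contrast concrete, here is a minimal Python/NumPy sketch (illustrative only, not from the article) of the kind of per-pixel work a GPU parallelizes: the same simple operation applied independently to every pixel, written once as a sequential CPU-style loop and once as a single data-parallel array operation. The frame size and tint function are assumptions for the example.

import numpy as np

# Illustrative only: the per-pixel work described above. A real 1080p frame
# has about 2 million pixels; a small one keeps the Python loop quick.
height, width = 108, 192
frame = np.zeros((height, width, 3), dtype=np.float32)
red = np.array([1.0, 0.0, 0.0], dtype=np.float32)

def tint_sequential(img, color):
    """CPU-style: visit each pixel in sequence and apply the same operation."""
    out = img.copy()
    for y in range(img.shape[0]):
        for x in range(img.shape[1]):
            out[y, x] = color
    return out

def tint_parallel(img, color):
    """GPU-style: express the identical operation over the whole array at once,
    the form that thousands of simple cores can execute concurrently."""
    return np.broadcast_to(color, img.shape).astype(img.dtype).copy()

# Both produce the same image; only the execution model differs.
assert np.array_equal(tint_sequential(frame, red), tint_parallel(frame, red))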
Software engineers discovered that the GPU’s massive parallel processing
was especially useful in deep learning. Instead of starting with a human-
made definition of a face, for example, researchers might show millions of
images of faces to let a computer—sometimes with human feedback—develop
its own definition of what a face looks like. The GPU can study examples
much more quickly than conventional processors, dramatically accelerating
the training phase.
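As a rough illustration of that training idea (a hypothetical sketch, not the hospital's or Nvidia's method), the following Python/NumPy snippet learns a tiny detector from labeled examples instead of a hand-written definition. The synthetic 8x8 "images", the logistic-regression model, and the learning rate are all assumptions made for the example.

import numpy as np

# Hypothetical sketch: learn a detector from labeled examples rather than
# hand-coding a definition, via simple gradient descent on a log-loss.
rng = np.random.default_rng(0)
n_pixels = 64                      # a tiny 8x8 "image", flattened
true_pattern = rng.normal(size=n_pixels)

def make_examples(n):
    """Synthetic stand-ins for face / non-face images with 0/1 labels."""
    images = rng.normal(size=(n, n_pixels))
    labels = (images @ true_pattern > 0).astype(np.float64)
    return images, labels

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Training: repeatedly show examples and nudge the weights toward fewer mistakes.
weights = np.zeros(n_pixels)
images, labels = make_examples(5000)
for step in range(200):
    preds = sigmoid(images @ weights)
    grad = images.T @ (preds - labels) / len(labels)   # gradient of the log-loss
    weights -= 0.5 * grad                              # gradient-descent update

test_images, test_labels = make_examples(1000)
accuracy = np.mean((sigmoid(test_images @ weights) > 0.5) == test_labels)
print(f"learned detector accuracy: {accuracy:.2%}")    # typically well above chance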
Startup Blue River Technology, a Nervana customer that has adopted GPU
technology, used photographs of crops and weeds to train a camera-equipped
computer system for tractors to decide where to spray herbicide.
“Those machines are making 5,000 see-and-spray decisions a minute,” said
Ben Chostner, the company’s vice president of business development.
But some argue that GPUs simply aren’t as efficient as those designed from
scratch for machine learning. Some companies, like Nervana and Movidius,
emulate the parallelism of GPUs but focus on moving data more quickly and
dispensing with features needed for graphics. Others, like International
Business Machines Corp. with a chip dubbed TrueNorth, have developed chip
designs inspired by the neurons, synapses and other features of the brain.
Mr. Huang said Nvidia was well aware of Google’s development effort. He
attributes Google’s motivation partly to the fact that, two years ago,
Nvidia’s GPUs were better suited for training than the later phase that
exploits the training to make analytical decisions. But Nvidia’s latest GPU
is more than 25 times faster than its predecessor at that work, he said.
s***d
Posts: 15421
2
Google just announced its own deep-learning hardware platform, the TPU, at its I/O conference.

★ Sent from iPhone App: ChineseWeb 11

[Quoting r***0's post above]
