TensorFlow in danger! And the one abandoning it is Google itself

2022-06-25 03:42:00 · QbitAI

Xiao Xiao, Feng Se | Posted from Aofeisi
QbitAI | WeChat official account QbitAI

Having racked up nearly 166,000 GitHub stars and witnessed the rise of deep learning, TensorFlow now finds its position at stake.

And this time, the shock comes not from its old rival PyTorch, but from a rookie: JAX.

In the latest round of heated debate in the AI community, even fast.ai founder Jeremy Howard weighed in:

That JAX is gradually replacing TensorFlow is already widely known. It is happening right now (at least inside Google).


LeCun believes the fierce competition between deep learning frameworks has entered a new stage.


As LeCun put it, Google's TensorFlow was originally more popular than Torch; but after Meta's PyTorch appeared, it overtook TensorFlow in popularity.

Now, Google Brain, DeepMind, and many external projects have all started using JAX.

A typical example is the recently viral DALL·E Mini: to make full use of TPUs, its author wrote it in JAX. Users marveled after trying it:

It's way faster than PyTorch.


According to Business Insider, JAX is expected to cover all of Google's machine-learning-powered products within the next few years.

Seen in that light, Google's vigorous internal push for JAX looks more like a campaign of "self-rescue" on the framework front.

Where did JAX come from?

Google has, in fact, been preparing JAX for quite a while.

As early as 2018, it was built by a three-person team at Google Brain.

The results were published in a paper titled Compiling machine learning programs via high-level tracing:


JAX is a Python library for high-performance numerical computing; deep learning is just one of its capabilities.


Since its birth, its popularity has kept climbing.

Its biggest selling point is speed.

Here's an example to get a feel for it.

Computing the sum of the first three powers of a matrix takes about 478 milliseconds with a NumPy implementation.


With JAX, it takes just 5.54 milliseconds, 86 times faster than NumPy.

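For a concrete feel, here is a minimal sketch of such a benchmark, assuming "the sum of the first three powers" means x + x@x + x@x@x with matrix products; the matrix size and exact code behind the 478 ms and 5.54 ms figures are not given in the original, so the numbers you get will differ.

```python
import numpy as np
import jax
import jax.numpy as jnp

def fn_np(x):
    # Sum of the first three matrix powers: x + x^2 + x^3
    return x + x @ x + x @ x @ x

@jax.jit  # XLA just-in-time compilation; the body is plain jax.numpy
def fn_jax(x):
    return x + x @ x + x @ x @ x

x_np = np.random.randn(2000, 2000).astype(np.float32)
x_jax = jnp.asarray(x_np)

fn_jax(x_jax).block_until_ready()  # warm-up call pays the compile cost
# Now time fn_np(x_np) against fn_jax(x_jax).block_until_ready();
# block_until_ready() matters because JAX dispatches work asynchronously.
```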

Why is it so fast? There are several reasons, including:

1. A NumPy accelerator. NumPy matters so much that nobody doing scientific computing or machine learning in Python can do without it, yet it has never natively supported GPUs or other hardware acceleration.

JAX's computational API is modeled closely on NumPy's, which makes it very easy to run models on GPUs and TPUs. That alone has won over plenty of people.

2. XLA. XLA (Accelerated Linear Algebra) is an optimizing compiler for accelerating linear algebra. Building JAX on top of XLA raises the ceiling on its computation speed substantially.

3. JIT. Researchers can use XLA to turn their functions into just-in-time compiled (JIT) versions; adding what amounts to a simple decorator to a computation function can speed it up by orders of magnitude.

Beyond that, JAX is fully compatible with Autograd and supports automatic differentiation: through function transformations such as grad, hessian, jacfwd, and jacrev, it supports both reverse-mode and forward-mode differentiation, and the two can be composed in any order.
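Here is a small sketch of what composing those transformations looks like, using a made-up scalar function purely for illustration:

```python
import jax
import jax.numpy as jnp

def f(x):
    # Illustrative scalar-valued function of a vector input
    return jnp.sum(jnp.tanh(x) ** 2)

df = jax.grad(f)                 # reverse-mode gradient
d2f = jax.jacfwd(jax.jacrev(f))  # forward-over-reverse composition,
                                 # which is what jax.hessian(f) computes

x = jnp.array([1.0, 2.0, 3.0])
print(df(x))   # gradient, shape (3,)
print(d2f(x))  # Hessian, shape (3, 3)
```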

Of course, JAX has its shortcomings too.

For example:

1. Although JAX bills itself as an accelerator, it does not fully optimize every operation in CPU computation.

2. JAX is still too new and has not formed a complete basic ecosystem the way TensorFlow has, which is why Google has yet to ship it as a polished product.

3. The time and cost required for debugging are uncertain, and its "side effects" are not entirely clear.

4. It does not support Windows, where it can only run inside a virtual environment.

5. It has no data loader of its own; you have to borrow TensorFlow's or PyTorch's (see the sketch after this list).

……
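On that last point, a common workaround is to wrap a PyTorch DataLoader so its batches come out as NumPy arrays, which JAX consumes directly. A minimal sketch, with a made-up dataset and shapes:

```python
import numpy as np
import torch
from torch.utils.data import DataLoader, TensorDataset

def numpy_collate(batch):
    # Turn a list of (tensor, tensor) samples into a pair of NumPy arrays
    xs, ys = zip(*batch)
    return np.stack([x.numpy() for x in xs]), np.array([int(y) for y in ys])

ds = TensorDataset(torch.randn(1000, 784), torch.randint(0, 10, (1000,)))
loader = DataLoader(ds, batch_size=128, collate_fn=numpy_collate)

for x_batch, y_batch in loader:
    pass  # feed x_batch, y_batch into a jitted JAX training step here
```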

Even so, the simple, flexible, easy-to-use JAX caught on first inside DeepMind. Deep learning libraries born in 2020, such as Haiku and RLax, are all built on it.
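As a taste of those libraries, here is a tiny sketch of defining a network with Haiku; the layer sizes and input shape are made up for illustration:

```python
import haiku as hk
import jax
import jax.numpy as jnp

def forward(x):
    # Haiku modules are declared inside a function, then made pure
    return hk.nets.MLP([128, 10])(x)

net = hk.without_apply_rng(hk.transform(forward))
params = net.init(jax.random.PRNGKey(0), jnp.ones([1, 784]))
logits = net.apply(params, jnp.ones([1, 784]))  # pure function of params
```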

This year, Adam Paszke, one of the original authors of PyTorch, also joined the JAX team full-time.

At present, JAX's open-source project has earned 18.4k stars on GitHub.

Notably, along the way, many voices have suggested it is likely to replace TensorFlow.

That is partly because of JAX's own strength, but it also has a great deal to do with TensorFlow itself.

Why did Google switch to JAX?

Born in 2015, TensorFlow was once all the rage: soon after launch it overtook trendy contenders like Torch, Theano, and Caffe to become the most popular machine learning framework.

In 2017, however, PyTorch made its entrance with a fresh face, a comeback of sorts.

This machine learning library, built by Meta on top of Torch, was easy to pick up and easy to understand. It quickly won the favor of many researchers and even showed signs of overtaking TensorFlow.

TensorFlow, by comparison, grew ever more bloated through frequent updates and interface iterations, gradually losing developers' trust.

(Judging by the share of questions on Stack Overflow, PyTorch has been rising year after year while TensorFlow has stagnated.)


As the competition wore on, TensorFlow's weaknesses were gradually exposed: unstable APIs, convoluted implementation, and high learning costs, problems that updates never solved even as the framework's structure grew more complex.

At the same time, TensorFlow failed to keep playing to its strengths, such as runtime efficiency.

In academia, PyTorch's usage has gradually surpassed TensorFlow's.

At major top conferences such as ACL and ICLR in particular, algorithm implementations in PyTorch have accounted for more than 80% in recent years, while TensorFlow's share keeps declining.

That is precisely why Google cannot sit still, and is trying to use JAX to recapture "dominance" among machine learning frameworks.

Although JAX is not billed as "a general-purpose deep learning framework", Google's resources have been tilting toward it ever since its release.

On one hand, Google Brain and DeepMind are building more and more libraries on top of JAX.

These include Google Brain's Trax, Flax, and JAX-MD, along with DeepMind's neural network library Haiku and reinforcement learning library RLax, all built on JAX.

In Google's own words:

As the JAX ecosystem develops, we will also take care to keep its design consistent (as far as possible) with existing TensorFlow libraries such as Sonnet and TRFL.

On the other hand, more and more projects are being implemented in JAX; the recently viral DALL·E Mini is one of them.

Because it can better exploit Google's TPUs, JAX outperforms PyTorch considerably in runtime performance, and more industrial projects once built on TensorFlow are turning to JAX as well.

Some netizens even quipped about why JAX is blowing up right now: maybe users of the TensorFlow framework simply couldn't stand it anymore.


So, does JAX stand a chance of replacing TensorFlow and becoming the new force facing off against PyTorch?

Which framework do you prefer?

Overall, many people still stand firmly by PyTorch.

They don't seem to love the pace at which Google turns out a new framework every year.


"JAX is attractive, but it isn't 'revolutionary' enough to push people to abandon PyTorch for it."


But the JAX optimists are not few, either.

One person said that PyTorch is great, but JAX is closing the gap.


Some even cheered wildly for JAX, claiming it is a good 10 times stronger than PyTorch and saying: if Meta doesn't keep pushing, Google is going to win. (Deliberately stirring the pot.)


Still, there are always people who don't care who wins or loses and take the long view:

There is no best, only better. What matters most is that more players and good ideas join in, so that open source leads to genuinely good innovation.


Project address:
https://github.com/google/jax

Reference links:
https://twitter.com/jeremyphoward/status/1538380788324257793
https://twitter.com/ylecun/status/1538419932475555840
https://mp.weixin.qq.com/s/AoygUZK886RClDBnp1v3jw
https://www.deepmind.com/blog/using-jax-to-accelerate-our-research
https://github.com/tensorflow/tensorflow/issues/53549
