
posted by chromas on Tuesday February 19 2019, @10:00PM
from the put-it-on-the-block-chain dept.

Deep learning may need a new programming language that's more flexible and easier to work with than Python, Facebook AI Research director Yann LeCun said today. It's not yet clear whether such a language is necessary, but the possibility runs up against deeply entrenched preferences among researchers and engineers, he said.

LeCun has worked with neural networks since the 1980s.

“There are several projects at Google, Facebook, and other places to kind of design such a compiled language that can be efficient for deep learning, but it’s not clear at all that the community will follow, because people just want to use Python,” LeCun said in a phone call with VentureBeat.

“The question now is, is that a valid approach?”

Python is currently the most popular language used by developers working on machine learning projects, according to GitHub’s recent Octoverse report, and the language forms the basis for Facebook’s PyTorch and Google’s TensorFlow frameworks.
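
For context on what that entrenchment looks like in practice, here is a minimal, illustrative PyTorch sketch (not from the article): a small network defined and trained in a few readable lines of Python, roughly the workflow any replacement language would have to beat.

    import torch
    import torch.nn as nn

    # Toy model: two linear layers with a ReLU in between.
    model = nn.Sequential(
        nn.Linear(10, 32),
        nn.ReLU(),
        nn.Linear(32, 1),
    )

    optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
    loss_fn = nn.MSELoss()

    # Random stand-in data: 64 samples, 10 features each.
    x = torch.randn(64, 10)
    y = torch.randn(64, 1)

    for step in range(100):
        optimizer.zero_grad()
        loss = loss_fn(model(x), y)   # forward pass
        loss.backward()               # autograd computes gradients dynamically
        optimizer.step()              # gradient descent update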

[...] Artificial intelligence is more than 50 years old, but its current rise has been closely linked to the growth in compute power provided by computer chips and other hardware.


Original Submission

 
  • (Score: 3, Interesting) by goodie (1877) on Wednesday February 20 2019, @01:34AM (#803804) Journal

    Being remotely aware of the type of work that people like LeCun are doing, I'd venture that what they want is something that does only deep learning, where parallelization comes built in, etc., not through yet another module that you need to integrate. Something highly optimized for it. Of course, that would suck in terms of interoperability when you actually have to put it in production and interact with external data and so on. Or maybe something purely tailored for deep learning that you could wrap behind an interface when you need those other things. Somewhat like when you build a computing pipeline in Matlab or Octave: it's really neat and parsimonious, but then you're locked into that ecosystem, so you end up using Python...
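
    (Purely as an illustrative sketch of that point, using PyTorch's real nn.DataParallel API with a placeholder model: in Python, parallelism is something a library bolts onto your model, not a property of the language itself.)

        import torch
        import torch.nn as nn

        model = nn.Linear(128, 10)  # placeholder model

        # Parallelism is opt-in and library-level: the framework splits
        # batches across GPUs; the language itself knows nothing about it.
        if torch.cuda.device_count() > 1:
            model = nn.DataParallel(model)
        model = model.to("cuda" if torch.cuda.is_available() else "cpu")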

    It may be a bad parallel, but it's a bit like R vs. Python to me. I used to not understand R and wanted to do my stats stuff in Python because I am a programmer. Fast-forward a few months and I don't understand why you'd want to use Python for anything like that, even with NumPy, SciPy, scikit-learn, etc. R is much better once you get the hang of working with dataframes. BUT, once you want to load data, manipulate it, use web services, etc., and basically do more "traditional" programming, Python is better. R has packages for these things too, but it feels awkward because it's not the way the language was designed. It is a statistical programming language, not a general-purpose programming language. Anyway, that's how I see it.
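
    (To make the comparison concrete, a hedged sketch with invented toy data: the grouped summary that is one natural line in R's dplyr, df %>% group_by(group) %>% summarise(mean(value)), and its pandas counterpart in Python.)

        import pandas as pd

        df = pd.DataFrame({
            "group": ["a", "a", "b", "b"],
            "value": [1.0, 2.0, 3.0, 4.0],
        })

        # Roughly equivalent to the dplyr one-liner above.
        summary = df.groupby("group")["value"].mean()
        print(summary)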
