
posted by chromas on Tuesday February 19 2019, @10:00PM   Printer-friendly
from the put-it-on-the-block-chain dept.

Deep learning may need a new programming language that’s more flexible and easier to work with than Python, Facebook AI Research director Yann LeCun said today. It’s not yet clear if such a language is necessary, but the possibility runs against very entrenched desires from researchers and engineers, he said.

LeCun has worked with neural networks since the 1980s.

“There are several projects at Google, Facebook, and other places to kind of design such a compiled language that can be efficient for deep learning, but it’s not clear at all that the community will follow, because people just want to use Python,” LeCun said in a phone call with VentureBeat.

“The question now is, is that a valid approach?”

Python is currently the most popular language used by developers working on machine learning projects, according to GitHub’s recent Octoverse report, and the language forms the basis for Facebook’s PyTorch and Google’s TensorFlow frameworks.

[...] Artificial intelligence is more than 50 years old, but its current rise has been closely linked to the growth in compute power provided by computer chips and other hardware.


Original Submission

 
This discussion has been archived. No new comments can be posted.
  • (Score: 0) by Anonymous Coward on Tuesday February 19 2019, @10:39PM (5 children)

    by Anonymous Coward on Tuesday February 19 2019, @10:39PM (#803721)

    What does Python (or any other language) lack that they're looking for? As a general rule, when a language like Python lacks something such as performance, people code a library in C or some other low-level language and call into it. The article didn't seem to give me any real answer on why this wouldn't work. They went on about not being able to use the GPU properly, but didn't say why some kind of API callable from other languages couldn't fix that. I haven't done any GPU programming, but I seem to recall that even C needed special APIs to manage it. They didn't throw out C, though, or even change the syntax. When C *did* have a shortcoming vs. FORTRAN, they added the restrict keyword. Problem solved. So: what's the shortcoming with Python, why can't a library solve it, and if new syntax is needed, what kind of syntax?
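
    For what it's worth, the usual workaround looks something like this (a minimal ctypes sketch; libm's sqrt just stands in for whatever hot loop you'd actually push down into C):

        # Call into an existing C library from Python. A real project would
        # build and wrap its own compiled extension; libm is used here only
        # because it's already on the system.
        import ctypes, ctypes.util

        libm = ctypes.CDLL(ctypes.util.find_library("m"))  # may need an explicit path on some platforms
        libm.sqrt.restype = ctypes.c_double
        libm.sqrt.argtypes = [ctypes.c_double]

        print(libm.sqrt(2.0))  # the heavy lifting happens in compiled C code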

  • (Score: 3, Interesting) by bzipitidoo on Tuesday February 19 2019, @11:22PM (1 child)

    by bzipitidoo (4388) on Tuesday February 19 2019, @11:22PM (#803752) Journal

    There are thousands of programming languages, and dissatisfied people are still inventing more, for the good reason that the whole menagerie leaves much to be desired.

    Building on existing languages seems to be in vogue this decade. One way is the transcompiler, which turns source code in someone's pet language into source code in a mainstream language. We're not talking merely libraries such as jQuery; there's TypeScript, CoffeeScript, LiveScript, and several others that transcompile to JavaScript. Another way is to tie into existing libraries: Kotlin is able to connect to Java and Android libraries.

    • (Score: 1, Funny) by Anonymous Coward on Wednesday February 20 2019, @12:09AM

      by Anonymous Coward on Wednesday February 20 2019, @12:09AM (#803772)

      RatFor forever!!

      https://en.wikipedia.org/wiki/Ratfor [wikipedia.org]

  • (Score: 3, Interesting) by goodie on Wednesday February 20 2019, @01:34AM

    by goodie (1877) on Wednesday February 20 2019, @01:34AM (#803804) Journal

    Being only remotely aware of the type of work that people like LeCun are doing, I'd venture that what they want is something that does only deep learning, where parallelization comes built in, etc., not through yet another module that you need to integrate. Something highly optimized for it. Of course, that would suck in terms of interoperability when you actually have to put it in production and interact with external data, etc. Or maybe something purely tailored for deep learning that you can use and wrap behind an interface when you need those other things. Somewhat like when you build a computing pipeline in Matlab or Octave: it's really neat and parsimonious, but then you're locked into that ecosystem, so you end up using Python...
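
    To make that concrete, here's roughly what the "yet another module" integration looks like today with PyTorch (one of the frameworks named in the summary); the layer sizes are made up for illustration:

        # Device placement and data movement are explicit library calls,
        # not something the language or compiler arranges for you.
        # The model and sizes here are arbitrary.
        import torch
        import torch.nn as nn

        device = "cuda" if torch.cuda.is_available() else "cpu"

        model = nn.Sequential(nn.Linear(128, 64), nn.ReLU(), nn.Linear(64, 10)).to(device)
        x = torch.randn(32, 128, device=device)  # a batch of fake inputs
        y = model(x)                              # forward pass runs on whichever device we picked
        print(y.shape)                            # torch.Size([32, 10])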

    It may be a bad parallel, but it's a bit like R vs. Python to me. I used to not understand R and wanted to do my stats stuff in Python because I am a programmer. Fast forward a few months and I don't understand why you'd want to use Python for anything like that, even with numpy, scipy, scikit-learn, etc. R is much better once you get the hang of working with dataframes. BUT, once you want to load data, manipulate it, use web services, etc., and basically do more "traditional" programming, Python is better. R has packages for these things too, but it feels awkward because that's not how the language was conceived. It is a statistical programming language, not a general-purpose programming language. Anyway, that's how I see it.
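
    For the dataframe half of that comparison, the Python counterpart is pandas; a grouped summary of the dplyr sort looks roughly like this (the file and column names are placeholders):

        # Load a CSV and summarise per group, the kind of thing R's
        # group_by()/summarise() makes trivial. "measurements.csv" and its
        # columns are hypothetical.
        import pandas as pd

        df = pd.read_csv("measurements.csv")
        summary = df.groupby("condition")["value"].agg(["mean", "std", "count"])
        print(summary)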

  • (Score: 3, Interesting) by RamiK on Wednesday February 20 2019, @03:15AM (1 child)

    by RamiK (1813) on Wednesday February 20 2019, @03:15AM (#803838)

    What does Python (or any other language) lack that they're looking for?

    That's the wrong question. Off the bat, efficient programming languages describe the hardware, since non-zero-cost abstractions are obstructions. VLIW GPUs had their own languages and compilers until fairly recently, when GPUs switched to something more C-compatible for compute. Now, with Moore's law running its course, Spectre making OoO cores inappropriate for sensitive operations, AI silicon not being C-compatible, and GPUs no longer making sense for cryptomining, we can fully expect half a dozen different compute cores getting glued together, each with its own ISA and its own language and compiler.

    So the question is, what is it in Python and C++ that's incompatible with Facebook's new AI silicon? If I had to guess, it's the memory model.
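
    One existing stab in that direction is TorchScript, which compiles a restricted, statically typed subset of Python into an IR that can be optimized and shipped independently of the interpreter. A sketch, not a claim about what Facebook's silicon actually needs:

        # Script a simple elementwise function; the decorated version carries
        # a compiled graph rather than running through the Python interpreter.
        import torch

        @torch.jit.script
        def gelu_approx(x: torch.Tensor) -> torch.Tensor:
            # tanh approximation of GELU, written as plain tensor ops
            return 0.5 * x * (1.0 + torch.tanh(0.79788456 * (x + 0.044715 * x * x * x)))

        print(gelu_approx(torch.randn(4)))
        print(gelu_approx.graph)  # the TorchScript IR for the function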

    --
    compiling...
    • (Score: 0) by Anonymous Coward on Wednesday February 20 2019, @06:35AM

      by Anonymous Coward on Wednesday February 20 2019, @06:35AM (#803894)

      So people in the domain who are not primarily programmers want a domain-specific language. Makes sense.