
posted by chromas on Tuesday February 19 2019, @10:00PM
from the put-it-on-the-block-chain dept.

Deep learning may need a new programming language that’s more flexible and easier to work with than Python, Facebook AI Research director Yann LeCun said today. It’s not yet clear whether such a language is necessary, but the possibility runs up against deeply entrenched preferences among researchers and engineers, he said.

LeCun has worked with neural networks since the 1980s.

“There are several projects at Google, Facebook, and other places to kind of design such a compiled language that can be efficient for deep learning, but it’s not clear at all that the community will follow, because people just want to use Python,” LeCun said in a phone call with VentureBeat.

“The question now is, is that a valid approach?”

Python is currently the most popular language used by developers working on machine learning projects, according to GitHub’s recent Octoverse report, and the language forms the basis for Facebook’s PyTorch and Google’s TensorFlow frameworks.
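
In both frameworks, Python serves largely as a thin front end: the model is defined in Python, while the heavy tensor math is dispatched to compiled C++ and CUDA kernels. A minimal PyTorch sketch (layer sizes and inputs are arbitrary, chosen only for illustration):

    import torch

    # A tiny model defined in Python; the tensor math itself runs in
    # PyTorch's compiled C++/CUDA backend rather than in the interpreter.
    model = torch.nn.Sequential(
        torch.nn.Linear(64, 32),
        torch.nn.ReLU(),
        torch.nn.Linear(32, 10),
    )

    x = torch.randn(8, 64)   # a batch of 8 random input vectors
    logits = model(x)        # forward pass dispatched to native kernels
    print(logits.shape)      # torch.Size([8, 10])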

[...] Artificial intelligence is more than 50 years old, but its current rise has been closely linked to the growth in compute power provided by computer chips and other hardware.


Original Submission

 
  • (Score: 2, Interesting) by Anonymous Coward on Wednesday February 20 2019, @12:35AM (#803787)

    Python is roughly 1% the speed of C. That is just pitiful.

    Python may work if you get free computers, air conditioning, power, and rack space. You'll also need to have no concerns about latency.

    If that stuff isn't free, what are you doing??? You are spending 100x as much as you need to be spending. You could lop 99% off of your costs.

    For latency, throwing more hardware at the problem does not help. The speed difference for single-threaded tasks amounts to at least 15 years' worth of computing-industry progress, probably more. You'll be computing like it's 1999.
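
    As a rough sketch of the kind of gap being described here, compare a pure-Python accumulation loop with the same reduction done by NumPy's compiled code (an illustration only, not a benchmark of any real deep learning workload; timings vary by machine):

        import time
        import numpy as np

        N = 10_000_000
        data = list(range(N))
        arr = np.arange(N, dtype=np.int64)

        t0 = time.perf_counter()
        total = 0
        for x in data:              # every iteration runs in the interpreter
            total += x
        t1 = time.perf_counter()

        t2 = time.perf_counter()
        total_np = int(arr.sum())   # a single call into a compiled loop
        t3 = time.perf_counter()

        print(f"pure Python: {t1 - t0:.3f}s  NumPy: {t3 - t2:.3f}s")
        assert total == total_np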
