
posted by martyb on Thursday May 28 2015, @04:47PM   Printer-friendly
from the application-programming-INTERFACE dept.

The Obama administration has asked the United States Supreme Court to decline to hear Google's appeal of a 2014 federal appeals court ruling that found Google infringed the copyright on Oracle's Java code:

The case involves how much copyright protection should extend to the Java programming language. Oracle won a federal appeals court ruling last year that allows it to copyright parts of Java, whilst Google argues it should be free to use Java without paying a licensing fee. Google, which used Java to design its Android smartphone operating system, appealed to the U.S. Supreme Court. The high court then asked the Obama administration in January for its opinion on whether it should take the case, because the federal government has a strong interest. The Federal Trade Commission, for instance, must ensure companies do not break antitrust laws when claiming software copyright protection against each other.

According to Google, an Oracle victory would obstruct "an enormous amount of innovation" because software developers would not be able to freely build on each other's work. But Oracle says effective copyright protection is the key to software innovation.

In the court filing on Tuesday, U.S. Solicitor General Donald Verrilli said Google's argument that the code is not entitled to copyright protection lacks merit and did not need to be reviewed by the Supreme Court. Verrilli added that Google had raised important concerns about the effect that enforcement of Oracle's copyright could have on software development, but said those issues could be addressed via further proceedings on Google's separate "fair use" defence in San Francisco federal court.

Google v. Oracle, U.S. Supreme Court, No. 14-410


[Editor's Comment: Original Submission]

 
  • (Score: 2) by Nerdfest on Thursday May 28 2015, @07:50PM

    by Nerdfest (80) on Thursday May 28 2015, @07:50PM (#189277)

    There are many languages a lot worse than Java. It sucks that Oracle owns it, but Java and other JVM-based languages are pretty good in general, in my opinion. If you find Java too strictly typed or verbose, try Groovy. I recently cut the amount of source in one large package by about 50% by switching to Groovy. It took about half an hour to convert the source to Groovy, delete a lot of the boilerplate code, and make a minor change to the build process. Less code to maintain, more readability, and it still works perfectly with Java.
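
    A minimal illustration of the kind of boilerplate involved (the Point class below is a made-up example, not from the package mentioned above): a plain Java data class needs a constructor and getters/setters that Groovy generates automatically.

        // A hypothetical Java data class, written out the long way.
        public class Point {
            private int x;
            private int y;

            public Point(int x, int y) {
                this.x = x;
                this.y = y;
            }

            public int getX() { return x; }
            public void setX(int x) { this.x = x; }
            public int getY() { return y; }
            public void setY(int y) { this.y = y; }
        }

        // The rough Groovy equivalent -- properties with getters and
        // setters are generated for you -- is just:
        //   class Point { int x, y }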

  • (Score: 3, Interesting) by FatPhil on Friday May 29 2015, @09:02AM

    by FatPhil (863) <{pc-soylent} {at} {asdf.fi}> on Friday May 29 2015, @09:02AM (#189575) Homepage
    No language apart from Java that I know of requires that you create a new Integer object to hold the integer value 0, because integers aren't objects. (Whilst at the same time permitting the constant integer-but-not-an-object ``0'' to have some meaning.)
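
    A minimal sketch of the distinction (standard Java; the class name is just for illustration): the primitive int is not an object, so anywhere an object is required the value must be wrapped in an Integer.

        import java.util.ArrayList;
        import java.util.List;

        public class BoxingDemo {
            public static void main(String[] args) {
                int primitive = 0;                  // the literal 0: not an object
                Integer boxed = Integer.valueOf(0); // a heap object holding 0
                // Collections hold only objects, so the int must be boxed:
                List<Integer> list = new ArrayList<>();
                list.add(primitive);                // autoboxed to Integer here
                System.out.println(boxed + " " + list.get(0));
            }
        }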

    No language apart from Java that I know of would take 5 major versions of the language to realise that the inefficiency caused by the above was horrifically stupid, and then, rather than fixing the problem, simply cache all pre-made integers between -128 and 127, so that the overhead would be a bit lower. For 256 Integers, that is.
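
    That cache is easy to observe (standard Java behaviour; the class name is made up): == compares object identity, so boxed values inside the cached range are the same object, while values just outside it are not.

        public class CacheDemo {
            public static void main(String[] args) {
                Integer a = 127, b = 127;   // inside the -128..127 cache
                Integer c = 128, d = 128;   // outside it: fresh objects
                System.out.println(a == b); // true  -- same cached object
                System.out.println(c == d); // false -- distinct objects
            }
        }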

    No language apart from Java that I know of has such baroque syntax for dealing with bignums, in particular when trying to use not-an-object integers as parameters to bignum methods. ``y = x.multiply(BigInteger.valueOf(10));'' is somehow deemed superior to ``y = x*10;''
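
    Spelled out as a compilable fragment (standard java.math API; variable names are arbitrary):

        import java.math.BigInteger;

        public class BignumDemo {
            public static void main(String[] args) {
                BigInteger x = BigInteger.valueOf(123456789L);
                // No operator overloading: the plain int 10 must itself be
                // wrapped before it can be passed to a BigInteger method.
                BigInteger y = x.multiply(BigInteger.valueOf(10));
                System.out.println(y); // 1234567890
            }
        }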

    No language apart from Java that I know of would first make System.in mutable, then, when people started changing it, make it final and thus immutable, and then later add a setter, setIn(), so that you can *change the value of a final variable*.
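
    The setter is real (standard java.lang.System API) and really does replace the final field:

        import java.io.ByteArrayInputStream;
        import java.util.Scanner;

        public class SetInDemo {
            public static void main(String[] args) {
                // System.in is declared final, yet this swaps it out:
                System.setIn(new ByteArrayInputStream("hello\n".getBytes()));
                Scanner sc = new Scanner(System.in);
                System.out.println(sc.nextLine()); // prints "hello"
            }
        }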

    It's a turd, there's no other word for it. And like COBOL, yes it will still be around in decades. The IT world's like that.
    --
    Great minds discuss ideas; average minds discuss events; small minds discuss people; the smallest discuss themselves