
from the swift-language-but-not-so-swift-name dept.
Apple surprised the audience at its Worldwide Developers Conference in San Francisco on Monday with a tool that few attendees expected: a new programming language for iOS and OS X development called Swift (https://developer.apple.com/swift/). There already is a programming language called Swift (http://swift-lang.org/main/) that was developed by the National Science Foundation, some other government agencies, and the University of Chicago for use in parallel computing applications. This isn't that. What it is, is an entirely new syntax that -- in the words of Apple senior VP Craig Federighi, who unveiled it during the Monday morning WWDC keynote -- aims to be "Objective-C without the baggage of C."
Some of that "baggage" will already be familiar to developers who cut their teeth on C but later moved on to scripting languages such as Python (and Federighi compared Swift to Python several times during his presentation). Like scripting languages but unlike C, Swift lets you get straight to the point: the single line println("Hello, world") is a complete program in Swift. Note, also, that you don't even have to end the statement with a semicolon, as you do in C. Semicolons are optional unless you're combining multiple statements on a single line; i.e., a semicolon is a statement separator rather than a statement terminator.
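A minimal sketch of what that means in practice. (One caveat: the 2014 beta spelled the function println; later Swift renamed it print, which is what's used here so the snippet compiles today.)

```swift
// A complete Swift program: no main(), no boilerplate, no semicolon.
print("Hello, world")

// A semicolon is only needed to separate two statements sharing a line:
let greeting = "Hello"; print(greeting)
```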
In addition to its online documentation, Apple has released an e-book, The Swift Programming Language, that's a free download (https://itunes.apple.com/us/book/the-swift-programming-language/id881256329) from the iBooks Store. To start working with the language itself, you'll need to download the beta release of Xcode 6 (https://developer.apple.com/xcode/downloads/), which includes tutorials to get you going.
Related Stories
Software engineer Dave DeLong has written an 18-part series on building an HTTP framework in Swift. Apple's Swift programming language is a general-purpose, open-source, compiled programming language intended to replace Objective-C. It is licensed under the Apache 2.0 license. In his series, Dave covers an Intro to HTTP, Basic Structures, Request Bodies, Loading Requests, Testing and Mocking, Chaining Loaders, Dynamically Modifying Requests, Request Options, Resetting, Cancellation, Throttling, Retrying, Basic Authentication, OAuth Setup, OAuth, and Composite Loaders.
Over the course of this series, we've started with a simple idea and taken it to some pretty fascinating places. The idea we started with is that a network layer can be abstracted out to the idea of "I send this request, and eventually I get a response".
I started working on this approach after reading Rob Napier's blog post on protocols on protocols. In it, he makes the point that we seem to misunderstand the seminal "Protocol Oriented Programming" idea introduced by Dave Abrahams (in character as "Crusty") at WWDC 2015. We especially miss the point when it comes to networking, and Rob's subsequent posts go into this idea further.
One of the things I hope you've realized throughout this series is that I never once talked about Codable. Nothing in this series is generic (with the minor exception of making it easy to specify a request body). There is no mention of deserialization or JSON or decoding responses or anything. This is extremely deliberate.
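That "I send this request, and eventually I get a response" idea can be sketched as a single small protocol. The names below are an illustrative paraphrase, not DeLong's actual types:

```swift
// A loader is anything that turns an HTTP request into an eventual response.
struct HTTPRequest { let url: String }
struct HTTPResponse { let status: Int }

protocol HTTPLoading {
    func load(_ request: HTTPRequest, completion: @escaping (HTTPResponse) -> Void)
}

// Because the abstraction is that small, loaders compose: throttling,
// retrying, or authentication can each wrap the next loader in a chain.
struct LoggingLoader<Next: HTTPLoading>: HTTPLoading {
    let next: Next
    func load(_ request: HTTPRequest, completion: @escaping (HTTPResponse) -> Void) {
        print("loading \(request.url)")
        next.load(request, completion: completion)
    }
}
```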
(Score: 3, Informative) by meisterister on Tuesday June 03 2014, @04:55PM
Why don't they make a language that's like Objective-C but without the baggage of Objective-C?
(May or may not have been) Posted from my K6-2, Athlon XP, or Pentium I/II/III.
(Score: 0) by Anonymous Coward on Tuesday June 03 2014, @08:40PM
why is this moderated flamebait?
I haven't used Objective-C myself, but everyone who has tells me that Objective-C is annoying.
It should be moderated funny if not informative.
Also apropos moderation: Users should be able to keep moderation points much longer. It's so annoying to have them for such a short time. Because sometimes you find a comment that absolutely deserves moderation and then you just don't have them.
(Score: 2) by BasilBrush on Tuesday June 03 2014, @09:34PM
They have. For example the number one complaint of people new to Obj-C is the square bracket method calling convention. Swift uses the more conventional periods and parentheses.
However it does retain all the good stuff from Obj-C.
Hurrah! Quoting works now!
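A side-by-side sketch of the two call conventions, using a hypothetical Greeter type (the Obj-C line is shown as a comment):

```swift
// A toy type to show the call-site difference:
class Greeter {
    func greet(name: String, politely: Bool) -> String {
        return politely ? "Good day, \(name)" : "Hi, \(name)"
    }
}

// Objective-C style would be roughly:
//   [greeter greetWithName:@"Ada" politely:YES];
// Swift keeps the named arguments but uses periods and parentheses:
let greeter = Greeter()
let line = greeter.greet(name: "Ada", politely: true)
```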
(Score: 3, Informative) by mrMagoo on Wednesday June 04 2014, @01:41AM
If you take Objective-C, and remove the C, you get Smalltalk.
I don't know how many folks noticed, but the chap that gave the Swift Demo is [url=http://en.wikipedia.org/wiki/Chris_Lattner]Chris Lattner[/url]. That's some fairly big juju.
"Good judgment comes from experience. Experience comes from bad judgment." -Originally attributed to Nasrudin
(Score: 1) by mrMagoo on Wednesday June 04 2014, @01:43AM
I keep forgetting to not use BBCode...
"Good judgment comes from experience. Experience comes from bad judgment." -Originally attributed to Nasrudin
(Score: 1, Interesting) by Anonymous Coward on Tuesday June 03 2014, @05:06PM
What advantage are they seeking by getting rid of statement terminators? I like those, they make it clear when a statement is finished.
Aside from the minor benefit of saving a single character per statement, what is the reason for dropping them?
(Score: 5, Funny) by Tork on Tuesday June 03 2014, @05:23PM
🏳️🌈 Proud Ally 🏳️🌈
(Score: 2) by c0lo on Tuesday June 03 2014, @05:25PM
Can't pass up the chances for bugs that a lax syntax [gotofail.com] brings, can we? I mean, without 0-days, how would the NSA take care of US citizens' security?
(Look at Pascal for example: so strict that hardly a syntax bug can slip past the compiler. And how much new software is written in Pascal nowadays?)
Besides, with the war on drugs and Afghanistan waning, new concepts and fronts need to be opened for war.
https://www.youtube.com/@ProfSteveKeen https://soylentnews.org/~MichaelDavidCrawford
(Score: 3, Funny) by VLM on Tuesday June 03 2014, @05:38PM
"Besides, with the war on drugs and Afghanistan waning, new concepts and fronts need to be open for war."
There's always the war on freedom. More seriously I wouldn't mind a jihad on mutable objects. Just go functional and be done with it.
(Score: 2) by HiThere on Wednesday June 04 2014, @12:10AM
Pascal doesn't handle utf-8 any better than does C. It's difficult to write parallel programs. Etc.
OTOH, if Pascal had been reasonably updated I think it would have been an excellent language.
Javascript is what you use to allow unknown third parties to run software you have no idea about on your computer.
(Score: 2) by Aiwendil on Wednesday June 04 2014, @05:58AM
Did you ever take a look at Objective Pascal/Delphi?
But yes, Pascal suffers from a delay before features of other languages reach it. (Then again, Ada suffers from its features not becoming popular until another language implements them, so being updated/first isn't always a good thing.)
(Score: 0) by Anonymous Coward on Thursday June 05 2014, @09:34AM
Actually, the Extended Pascal standard suffered mostly from not being implemented by many compilers. Maybe it was a mistake to publish it as a separate standard instead of as an update to the Pascal standard; but then, most Pascal compilers largely ignored the Pascal standard anyway, and on the other hand, C++ later proved that you can be successful with a new standard based on an existing one.
(Score: 3, Informative) by theluggage on Tuesday June 03 2014, @05:46PM
Uh - my guess at a boring answer? It's because Swift source code supports Unicode: e.g. you can have Greek letters as variable names (or emoticons, but then you'd really have to be shot for the good of humanity). That means there is a single formal definition [wikipedia.org] of what characters count as line terminators.
Just be thankful that's as far as they ventured down the road to hell that is significant whitespace, or we could have had another Python on our hands...
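A quick illustration of that Unicode support, assuming nothing beyond the standard library:

```swift
// Swift identifiers can be drawn from Unicode, so Greek letters
// (and, regrettably, emoji) are legal variable and function names:
let π = 3.141592653589793

func circumference(radius: Double) -> Double {
    return 2 * π * radius
}
```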
(Score: 2) by BasilBrush on Tuesday June 03 2014, @08:39PM
Losing an unnecessary punctuation character makes for cleaner code.
If you like them, you are in luck because there's nothing stopping you from using them. They are still there, they are just optional unless you want multiple statements per line.
Hurrah! Quoting works now!
(Score: 0) by Anonymous Coward on Tuesday June 03 2014, @09:14PM
Then practice what you preach and stop using periods in your prose After all, as long as there is a capital letter at the beginning of the sentence or a newline, then the end of the sentence is obvious Of course there is still some ambiguity with acronyms and the like, so continue using periods there Of course there is the slight possibility that less punctuation can in some cases make things harder to read but I suppose that's the price of progress
(Score: 2, Insightful) by Tramii on Tuesday June 03 2014, @10:18PM
Cute, but wrong
If you wanted your example post to be completely accurate, you would have written your post with hard returns at the end of every sentence
If you don't add the hard returns, it would still require that you add periods in between sentences
Frankly, I think it would be weird to leave off the semicolons, but I don't foresee any issues with the way Swift implemented it
(Score: 0) by Anonymous Coward on Tuesday June 03 2014, @11:03PM
I see :)
I would note though that GP claimed 'losing an unnecessary punctuation character' made code cleaner; I was trying to argue by example by demonstrating that unnecessary punctuation, like periods, can make prose more readable.
(Score: 2, Interesting) by zsau on Wednesday June 04 2014, @11:45PM
seeing as there is no convention of interpreting a capital letter as a separator (but there is a convention of interpreting a new line as a separator)
and seeing as it's increasingly common to see newlines added in english prose even where paragraph boundaries had not been common
(i've even seen it used in the middle of sentences, e.g. in a transcript of a British minister in a speech about Scottish independence)
then really, if you're going to say "let's get rid of unnecessary english punctuation", it's capital letters and fullstops you'd be killing, replacing them with newlines boundaries
both of which are not uncommon anyway: one day, all english prose might look like this
(Score: 2) by BasilBrush on Tuesday June 03 2014, @10:19PM
Nice try. But you didn't use new lines to separate either, so by analogy with Swift you'd need periods in the paragraph you wrote.
When using English in a form where a newline is a separator, such as a bullet pointed list, or title and subtitle, it is indeed common to not terminate each item with a period.
Hurrah! Quoting works now!
(Score: 0) by Anonymous Coward on Tuesday June 03 2014, @10:59PM
FWIW I use punctuation in lists, eg
1) Thing a;
2) Thing b; and,
3) Thing c.
(Score: 2) by BasilBrush on Wednesday June 04 2014, @03:25PM
You're doing it wrong.
But hey, just like English, Swift allows you to put those statement terminators in there if you really want.
Hurrah! Quoting works now!
(Score: 2) by tibman on Tuesday June 03 2014, @10:26PM
Would mod you funny if able : )
SN won't survive on lurkers alone. Write comments.
(Score: 3, Interesting) by forsythe on Tuesday June 03 2014, @10:12PM
I'm not really so interested in what I can do with formatting, or how pretty my code looks (within reason). I'll adapt. I'm interested in the wonderful, subtle bugs that other programmers can create, then leave for me to encounter. If optional semicolons lead to rules that require people to write blog posts like this [benalman.com] to illustrate them, that's rather counterproductive to the notion of `cleaner code'.
Of course, if they're done well, and no unintuitive behavior of any sort arises, then good for Apple. But somehow I doubt that.
(Score: 3, Informative) by BasilBrush on Tuesday June 03 2014, @10:28PM
I'm not a Javascript user, so I can't comment on semi-colon use there. However I do recall reading that Javascript was originally hacked together in about 4 days by one person. So maybe that explains the lack of rationality there.
Swift on the other hand has been in development at Apple for 4 years, before any public release, including major use by their developer tools team. If there were ambiguities introduced by optional semicolons I would expect they'd have spotted them by now, and made semicolons mandatory again.
There's been plenty of other languages that didn't need semi-colon terminators. It's hardly an impossible task.
Hurrah! Quoting works now!
(Score: 2) by Aiwendil on Tuesday June 03 2014, @05:30PM
Semicolon is great for when CR/LF and such has been messed up when crossing platforms or passing the code through external tools.
The begin .. end; or { } is good for giving a visual clue about where the logical breaks are in the "flow" of the code. It also makes nesting of subprograms easier to follow, and it creates consistency for the start-end of blocks triggered by a loop or if-statement (for instance, while true { }: how is that done without the extra cruft?)
And don't get me started on how good it is to actually have named blocks called from an exit-statement (even when not in a loop)
Also, a language that allows for shortcuts sadly enough tends to cause headaches (implicit typecasts and allowing use of non-defined variables are downright dangerous [imagine having a variable called "grey" to define just what shade to use, now imagine someone else starting to use "gray" in the same code thinking it is the same])
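For what it's worth, Swift takes the strict side on that last point; a minimal sketch:

```swift
// Swift heads off the grey/gray bug: variables must be declared before
// use, so a misspelling is a compile error rather than a fresh variable.
var grey = 0.5
grey = 0.6       // fine
// gray = 0.7    // error: cannot find 'gray' in scope
```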
(Score: 0) by Anonymous Coward on Tuesday June 03 2014, @05:40PM
Implicit typecasts, when restricted to the proper set of operations, are absolutely not dangerous (even Pascal had them!). The problem with languages like C is that they have too many of them, most of which are simply not a good idea (this includes some which at first sight look like a good idea, like the implicit conversion from signed to unsigned types).
Restrict type conversions to the cases where they make sense, and you'll have no problems with them. Note that without type conversions, object oriented programming would be basically impossible with statically typed languages, because you need implicit conversion from pointer/reference to derived to pointer/reference to base.
(Score: 3, Interesting) by Aiwendil on Tuesday June 03 2014, @07:02PM
I actually like how Ada approaches this. In Ada one defines types first, and then - if needed - subtypes, and conversion between subtypes that share the same type is implicit (with a range check in case the origin subtype does not fit completely within the target subtype), but converting outside of this type requires explicit typecasting.
I agree that typecasting is good to have (many things would indeed be impossible, or at least a lot harder, without it), but it is implicit typecasting that is bad; I'm all for explicit typecasting.
However, I fail to see why one needs implicit conversion for OOP (i.e. I fail to see why explicit typecasting wouldn't suffice)
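Swift happens to land close to the Ada position here; a small sketch of both halves (explicit numeric conversion, implicit upcast):

```swift
// Numeric conversions must be spelled out; there is no implicit
// widening, not even from Int to Double:
let count = 3
let ratio = Double(count) / 4.0   // the Double(count) is required

// But the derived-to-base conversion that OOP relies on stays implicit:
class Shape {}
class Circle: Shape {}
let shape: Shape = Circle()       // no cast needed
```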
(Score: 0) by Anonymous Coward on Thursday June 05 2014, @09:41AM
Well, formally you of course could do only explicit typecasting in statically-typed OOP as well, but it would fully defeat the point of OOP (basically every derived class would automatically violate the Liskov Substitution Principle in this regard). Here's an explicit example (using C++ syntax):
(Score: 3, Interesting) by VLM on Tuesday June 03 2014, @05:48PM
"Swift is the result of the latest research on programming languages"
... as of 1985 ...
Now that I've already made the statement, may as well try to verify its truth: does it have anything more modern / exciting than duck typing and no semicolons?
It might be a good language, but if you PR me into "innovative" and "new" with something from the 80s I'll feel worse than if they just dumped it and asked "take a look".
Given the PR I expected something like them including Light Table and Clojure in every OSX install or something. Then I eagerly read on and find... the compiler inserts implicit semicolons... oh, that's a "Turing Award" winning innovation right there, for sure, LOL.
(Score: 5, Insightful) by TheRaven on Tuesday June 03 2014, @07:41PM
There's not much novel in Swift, but it does look like the language you get when you look at a lot of existing languages and cherry-pick the nice bits. The big omission is concurrency support (no message passing / channels). Don't forget that the main goal of Swift is binary compatibility with Objective-C: it is intended to let developers move to a language where it's easy to migrate to an environment that allows more interesting things.
I'm generally pretty critical of Apple, but Swift is the first 'new' programming language I've seen in a long time and not said 'WTF?' to. It's not especially novel, but it is an example of what Apple used to be well-known for: taking existing ideas, polishing the hell out of them, and coming up with a clean and easy to use product.
sudo mod me up
(Score: 2) by BasilBrush on Tuesday June 03 2014, @08:57PM
Who said it uses duck typing? It's a type-safe language.
If you read the book, you'll find lots of new stuff.
One fantastic improvement over C is optional values. When you declare a value of any type (not just what would be a pointer in C) you can make it optional. This means that as well as all the normal values of that type, it can have no value (nil). If it's not optional, it can't have the nil value.
This means that it is a definite fact, both to the programmer and the compiler, whether a particular value can be nil and must be considered. No need to guess, or to check just in case as in C.
Furthermore, once you check that a value is not nil, you get an "unwrapped" version of that value which is not optional, so you know you don't need to check it for nil again.
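The feature being described is what the Swift book calls optionals; a minimal sketch:

```swift
// An optional Int can hold either an Int or nil; a plain Int can't:
var maybeAge: Int? = nil
maybeAge = 42

// "if let" both checks for nil and unwraps: inside the branch,
// `age` is an ordinary Int that needs no further nil checks.
var nextYear = 0
if let age = maybeAge {
    nextYear = age + 1
}
```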
And yes, it has a lighttable like IDE. No idea why you mentioned the completely different and obscure language of Clojure though.
Hurrah! Quoting works now!
(Score: 0) by Anonymous Coward on Thursday June 05 2014, @12:35PM
That's also not a terribly new concept. Yes, it's not there in C (not directly, at least; you can easily emulate it using a pointer, which given the low-level nature of C is the right choice). But it has been in several functional languages for many years, has been implemented in Boost for quite some time (boost::optional), and is considered for inclusion in the next C++ standard.
(Score: 0) by Anonymous Coward on Wednesday June 04 2014, @02:55AM
Eighties? Try the sixties. John McCarthy's original Lisp already had duck typing, and its syntax had no semicolons either. It also had garbage collection, first class functions, metaprogramming with a flexibility that no language outside of the Lisp family can even touch, and many other features that are only now becoming mainstream. I wonder what features the next big fad language is going to steal from Lisp. First-class continuations? Nah.
(Score: 2) by DrMag on Tuesday June 03 2014, @05:48PM
http://xkcd.com/927/ [xkcd.com]
Is the problem really the existing languages, or is it laziness/insufficient education?
(Score: 3, Interesting) by Sir Garlon on Tuesday June 03 2014, @06:42PM
High-level languages are for humans, not computers. Therefore, programming languages can be designed to support a set of implementation patterns that are usable to a population of developers as they approach certain general classes of problems.
It seems perfectly reasonable to me that a new language can be tailored to be efficient in a certain domain. Matlab springs to mind: its fundamental data structure is a matrix, and if you learn how to express your computational problems as matrix algebraic problems, Matlab is super fast and efficient to program in. It was mainly developed for electrical engineers, who work with matrices every day. Note the confluence here of a population (electrical engineers) and a dominant paradigm (matrix algebra).
I wouldn't say a new programming language is a solution to the problem of bad developers. Instead I would say a new programming language can be an optimized tool for building programs in a certain way. If you have a population of developers who are ready to think and solve problems in a certain way, then of course you can optimize a tool (really, a tool chain) to increase productivity and decrease error rates.
In addition to the mental model that a language supports, there is also the question of how it fits into your tool chain. For example, Unicode support presumably matters a lot if you need to support internationalization to Asian countries. One of the most common criticisms of Python is that whitespace is significant. When I learned Python, I realized that is equivalent to saying "Python wants me to change editors." When I got over that, I made peace with the oddball requirement. Some people do not want to change their tools and for good reason; Python is probably not for them.
In the case of Apple, Apple has long held the philosophy of vertically integrating the whole user experience. That includes the user experience of developers as well as end users. My experience is limited to dabbling with some iOS apps but it seems to me there is an "Apple way" to design a UI; there is a specific Apple IDE, Xcode, that is pretty well mandatory. Under those circumstances it makes perfect sense for Apple to (re)design a programming language to optimally fit into that ecosystem.
[Sir Garlon] is the marvellest knight that is now living, for he destroyeth many good knights, for he goeth invisible.
(Score: 2) by BasilBrush on Tuesday June 03 2014, @09:02PM
If you're familiar with Objective-C and the Apple patterns for making apps, then read the Swift book: it's pretty obvious that this new language is a great replacement for Obj-C in ways that no existing language is.
It doesn't mean that there aren't other good languages. Just that this one fits existing Obj-C paradigms better, whilst introducing lots of great new stuff, and getting rid of anachronisms.
Hurrah! Quoting works now!
(Score: 2, Interesting) by Zanothis on Tuesday June 03 2014, @06:10PM
yOffsetForTime = { i in
    return 80 * sin(i / 10.0)
}
After seeing their enumerations, the keyword "in" seems like the worst possible choice for this syntax (stick around for what it does). My coworker and I spent about 15 minutes trying to find out in the language documents WTF that block of code was supposed to do.
Apple's goal seems to have been to answer the question: "What's the most confusing syntax for lambdas that we can come up with?"
Does anyone know of any other language that uses for-in loops and uses "in" for denoting lambdas?
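For reference, the `in` simply separates the closure's parameter list from its body. Here is the same closure in full and abbreviated forms (yOffsetForTime as in the parent comment, with explicit types added so it stands alone):

```swift
import Foundation  // for sin()

// `in` marks where the parameters end and the body begins:
let yOffsetForTime = { (i: Double) -> Double in
    return 80 * sin(i / 10.0)
}

// Shorthand forms drop progressively more ceremony:
let short1: (Double) -> Double = { i in 80 * sin(i / 10.0) }
let short2: (Double) -> Double = { 80 * sin($0 / 10.0) }
```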
(Score: 2) by NCommander on Tuesday June 03 2014, @06:34PM
Holy eyebleed. I haven't had a chance to look at the Swift pages yet, but this feels like C and Forth had a freaking baby with a bit of Python thrown in.
I don't mind odd syntaxes (I don't even mind working with Forth), but that's just ... *ow*.
Seriously, I know a fair number of people dislike ObjC (its syntax is kinda wonky), but it works well, and integrates nicely with basically every existing C and C++ codebase in existence. How the heck does this actually improve anything?!
Still always moving
(Score: 1) by NickM on Tuesday June 03 2014, @08:55PM
I a master of typographic, grammatical and miscellaneous errors !
(Score: 0) by Anonymous Coward on Thursday June 05 2014, @12:43PM
So it corresponds to the following C++ code:
However, I don't see the big advantage over a simple function (the only difference at the call site being round parentheses instead of square ones).
(Score: 2) by TheRaven on Wednesday June 04 2014, @08:43AM
sudo mod me up
(Score: 1) by mrMagoo on Tuesday June 03 2014, @06:54PM
I've been programming Objective-C for years (Mac and iOS). I also write a lot of PHP.
I'm not so thrilled with either language (I came from C++), but they are the tools I need to get a job done.
I've learned (and subsequently forgotten) tons of languages in my time.
If I can learn XSLT, Swift ain't no thing.
However, high-level languages do tend to have a certain "faddishness."
ObjC was forced on us, because it was the only language that could be used to efficiently program Macs (and, later on, iOS devices).
Swift, like XAML, is an "extra" language, aimed at folks who are trying to get started programming. It won't be forced.
Anyone remember XAML [microsoft.com]?
Anyone?
Bueller?
"Good judgment comes from experience. Experience comes from bad judgment." -Originally attributed to Nasrudin
(Score: 3, Informative) by BasilBrush on Tuesday June 03 2014, @09:25PM
I think you misjudge it. First of all it contains every feature that Objective-C has, plus others that Obj-C doesn't have. It's an upgrade not a simple thing for beginners. Plus Apple are claiming it is faster than Obj-C. Whilst the truth of that claim has yet to be tested, it does indicate that Apple expects Swift to be used as the first choice for apps in the future - once it gets past beta.
I'm in the middle of a project now, and I expect to complete it using only Obj-C. But when I start on my next project, I expect to be using Swift, if it's considered solid enough by then.
But yes I don't think it'll be forced. Obj-C will still be there, because existing projects need it, and not everyone will want to learn a new language.
XAML is not relevant. Microsoft has come up with so many programming languages and API layers over the years. And XAML was a markup language for defining interfaces. Objective-C and the Cocoa (NextStep) APIs have consistently been the way to produce OSX (NextStep) apps since day one. Other than a half-hearted optional choice of Java for a while, this is the first change in primary language. And it's a general purpose language that supports everything that the preceding language does. Which makes it different in every way from XAML.
Hurrah! Quoting works now!
(Score: 1) by mrMagoo on Tuesday June 03 2014, @10:04PM
Good points.
Well, in any case, I'll be larnin' up on Swift, and I'm sure I'll be using it before long.
From what I could see, it seems to be aimed only at iOS.
I was also talking to one of my Windows chaps, and he said that XAML is not a lot more necessary for Metro stuff, so rumors of its death have been greatly exaggerated.
"Good judgment comes from experience. Experience comes from bad judgment." -Originally attributed to Nasrudin
(Score: 1) by mrMagoo on Tuesday June 03 2014, @10:27PM
Make that "NOW", not "NOT".
"Good judgment comes from experience. Experience comes from bad judgment." -Originally attributed to Nasrudin
(Score: 0) by Anonymous Coward on Tuesday June 03 2014, @07:39PM
No semicolons? Wow, they caught up with REXX from the 1980s! The degree of innovation takes my breath away.
(Score: 2, Insightful) by Anonymous Coward on Tuesday June 03 2014, @07:46PM
What's sad about this is each corporation has a new language for its walled garden. Microsoft has their C# language and its mountain of libraries and frameworks. Apple has Objective-C and Swift and a mountain of libraries and frameworks. Google at least uses Java, which has applications outside of its walled garden.
Skills are no longer portable.
If there is a shortage of talent, why aren't we getting back to industry standards? Why does every corporation have to reinvent the same thing in a different way? Why isn't there one industry-standard static language like C++ and one industry-standard dynamic language like Swift, rather than every corporation having their own languages? Wasn't too long ago that stuff like GUI libraries were actual libraries that could be called from industry-standard languages like C.
And the job market wants incredibly specific, narrow, niche skills. They don't care if you have used every OO language since Smalltalk - if you don't have 5 years experience with a specific language like Objective-C, you might as well not even try to get a job, even if that language is trivial to learn. So if employers really wanted people with talent, they'd look for talent and let the talent absorb whatever the fad language of the week is.
(Score: 0) by Anonymous Coward on Tuesday June 03 2014, @08:42PM
"Why does every corporation have to reinvent the same thing in a different way?"
To fuck developers.
Next question.
(Score: 2) by DECbot on Tuesday June 03 2014, @11:40PM
public void developers(){
    print("developers, developers, developers, developers, ");
    // println("developers. God I love fucking over those developers, developers, developers, ")
    developers();
}
cats~$ sudo chown -R us /home/base
(Score: 2) by DECbot on Tuesday June 03 2014, @11:53PM
sorry, forgot to overload my methods.
public String developers(Object ceo) throws Chair{
    try{
        print("developers, developers, developers, developers, ");
        // println("developers. God I love fucking over those developers, developers, developers, ")
        return developers(ceo);
    } catch (Exception ex){
        throw new Chair("developers, developers, developers, " + developers(ceo));
    } finally {
        print("developers, developers, developers, developers, ");
    }
}
cats~$ sudo chown -R us /home/base
(Score: 2) by BasilBrush on Tuesday June 03 2014, @10:35PM
If there was an existing open standard language that did what Apple wanted, they'd no doubt have used it. They've created their own because there wasn't such a language already.
Why is Apple different? Because they need a language that is a good progression from Objective-C. Such that it will both seamlessly interoperate with existing code, and use patterns that are familiar to Obj-C coders. No other company has the same requirement.
Hurrah! Quoting works now!
(Score: 0) by Anonymous Coward on Wednesday June 04 2014, @05:28AM
All you're saying is that this is just a generation 2 lock-in language. And instead of Apple doing a u-turn regarding their bad behavior they're doing more of the same. What a surprise.
Developers get fucked as pointed out and on the bigger scale society.
(Score: 2) by BasilBrush on Wednesday June 04 2014, @03:31PM
I'm a developer, and I'd much rather program for OSX and iOS than other platforms. So I'm certainly not fucked. If you believe in cross-platform development you might have a different idea. But I don't.
Hurrah! Quoting works now!
(Score: 2, Insightful) by Tramii on Tuesday June 03 2014, @10:47PM
I'm looking forward to seeing a bunch of new job entries in the next few months that read:
"Senior Swift Developer Wanted. Min. 5 years experience required!"
(Score: 3, Interesting) by gringer on Wednesday June 04 2014, @05:33AM
If you're coding in R, the following is a valid program:
[producing the usual output of "hello world" printed to the console, but without a line break]
Semicolons are also optional in R, and won't affect program flow, but I prefer using them.
By the by, the following is also a valid R program:
And will produce as output (assuming it is the last statement in the code) a single string value:
I notice someone else posted some odd Swift code above that involved an implicit function definition. Here's the R version:
Maybe you want the value for a whole range of numbers:
Or a matrix:
And that's the same function. This is one of the many reasons I like prototyping in R. You get vectorisation with very minimal (if any) code changes.
Ask me about Sequencing DNA in front of Linus Torvalds [youtube.com]