from the hello-world-of-networking dept.
Software engineer Dave DeLong has written an 18-part series on building an HTTP framework in Swift. Apple's Swift programming language is a general-purpose, open-source, compiled programming language intended to replace Objective-C. It is licensed under the Apache 2.0 license. In his series, Dave covers an Intro to HTTP, Basic Structures, Request Bodies, Loading Requests, Testing and Mocking, Chaining Loaders, Dynamically Modifying Requests, Request Options, Resetting, Cancellation, Throttling, Retrying, Basic Authentication, OAuth Setup, OAuth, and Composite Loaders.
Over the course of this series, we've started with a simple idea and taken it to some pretty fascinating places. The idea we started with is that a network layer can be abstracted out to the idea of "I send this request, and eventually I get a response".
I started working on this approach after reading Rob Napier's blog post on protocols on protocols. In it, he makes the point that we seem to misunderstand the seminal "Protocol Oriented Programming" idea introduced by Dave Abrahams (in his "Crusty" persona) at WWDC 2015. We especially miss the point when it comes to networking, and Rob's subsequent posts go into this idea further.
One of the things I hope you've realized over the course of this series is that nowhere in it did I ever talk about Codable. Nothing in this series is generic (with the minor exception of making it easy to specify a request body). There is no mention of deserialization, JSON, or decoding responses. This is extremely deliberate.
The point of HTTP is simple: You send an HTTP request (which we saw has a very well-defined structure) and you get back an HTTP response (which has a similarly well-defined structure). There's no opportunity to introduce generics, because we're not dealing with a general algorithm.
So this raises the question: where do generics come in? How do I use my awesome Codable type with this framework? The answer is: the next layer of abstraction.
HTTP, at least through version 1.1, is a straightforward, text-based, human-readable protocol.
(2019) Apple Patents Programming Language Feature
(2017) Undefined Behavior != Unsafe Programming
(2016) New Swift Release Has Port to Ubuntu
(2015) Apple's Swift Language Might Actually be Good
(2014) Apple Aims to Speed up Coding with its own Swift Programming Language
Apple surprised the audience at its Worldwide Developers Conference in San Francisco on Monday with a tool that few attendees expected: a new programming language for iOS and OS X development called Swift (https://developer.apple.com/swift/). There already is a programming language called Swift (http://swift-lang.org/main/) that was developed by the National Science Foundation, some other government agencies, and the University of Chicago for use in parallel computing applications. This isn't that. What it is, is an entirely new syntax that -- in the words of Apple senior VP Craig Federighi, who unveiled it during the Monday morning WWDC keynote -- aims to be "Objective-C without the baggage of C."
Some of that "baggage" will already be familiar to developers who cut their teeth on C but later moved on to scripting languages such as Python (and Federighi compared Swift to Python several times during his presentation). Like scripting languages but unlike C, Swift lets you get straight to the point. The single line println("Hello, world") is a complete program in Swift. Note, also, that you don't even have to end the statement with a semicolon, as you do in C. Those are optional unless you're combining multiple statements on a single line; i.e., a semicolon is a statement separator rather than a statement terminator.
In addition to its online documentation, Apple has released an e-book, The Swift Programming Language, that's a free download (https://itunes.apple.com/us/book/the-swift-programming-language/id881256329) from the iBooks Store. To start working with the language itself, you'll need to download the beta release of Xcode 6 (https://developer.apple.com/xcode/downloads/), which includes tutorials to get you going.
The hype around Swift is near non-existent by Apple standards, yet the language has attracted high praise since its release last year. Swift is essentially one of the very few Apple products representing a clear departure from the hardware-led approach Steve Jobs took to the business. If Stack Overflow's 2015 dev survey is anything to go by, it looks as if the Swift language might have potential to really shake things up.
Might the days of Apple programmers relying upon Objective-C be numbered?
ZDNet reports that, concurrent with the release of version 2.2 of its implementation of the Swift programming language, Apple Inc. has made available a port of the compiler, standard libraries, debugger, and REPL (read-eval-print loop) to the Ubuntu operating system. The port does not include the core libraries, which the company says are not suitable for production use. The language is frequently used in software intended to run under Apple's OS X and iOS operating systems.
John Regehr, Professor of Computer Science, University of Utah, writes:
Undefined behavior (UB) in C and C++ is a clear and present danger to developers, especially when they are writing code that will execute near a trust boundary. A less well-known kind of undefined behavior exists in the intermediate representation (IR) for most optimizing, ahead-of-time compilers. For example, LLVM IR has undef and poison in addition to true explodes-in-your-face C-style UB. When people become aware of this, a typical reaction is: "Ugh, why? LLVM IR is just as bad as C!" This piece explains why that is not the correct reaction.
Undefined behavior is the result of a design decision: the refusal to systematically trap program errors at one particular level of a system. The responsibility for avoiding these errors is delegated to a higher level of abstraction. For example, it is obvious that a safe programming language can be compiled to machine code, and it is also obvious that the unsafety of machine code in no way compromises the high-level guarantees made by the language implementation. Swift and Rust are compiled to LLVM IR; some of their safety guarantees are enforced by dynamic checks in the emitted code, other guarantees are made through type checking and have no representation at the LLVM level. Either way, UB at the LLVM level is not a problem for, and cannot be detected by, code in the safe subsets of Swift and Rust. Even C can be used safely if some tool in the development environment ensures that it will not execute UB. The L4.verified project does exactly this.
Apple's Swift language has an "optional chaining" feature that Apple finds novel enough to patent.
From the discussion, it appears Apple is intentionally using an Apache 2 license to ensure that access to this feature remains freely available. (Insert obligatory IANAL disclaimer.) Any Soylentils care to weigh in?
(Score: 1) by shrewdsheep on Thursday December 08, @03:30PM
Is this story a hint about rehash needing an overhaul?
(Score: 3, Interesting) by RamiK on Thursday December 08, @03:48PM (1 child)
Swift might be acceptable compared to C/C++, but the lack of thread safety makes it unfit for the job. Especially so in a space where Go and Rust would get you better results, with a special emphasis on Go for having 73% of its use cases being "API / RPC services" and a 93% satisfaction rating: https://go.dev/blog/survey2022-q2-results [go.dev]
And there's something to be said about Java too here for having so many existing solutions... But assuming you want to roll your own, Go is the most natural choice by far.
(Score: 1, Informative) by Anonymous Coward on Thursday December 08, @06:16PM
I recently wrote a web service in Go (first time I had touched web programming in 20 years). It was ridiculously easy, and ridiculously fast. Threading was trivial. Thanks to it being a "real" compiled language, by the time it compiled cleanly it worked perfectly.