Why developers are going nuts over Apple’s new ‘Swift’ programming language

“When Apple unveiled a new programming language at its World Wide Developers Conference on Monday, the place went ‘nuts,’ erupting with raucous cheers and applause,” Cade Metz reports for Wired.

“WWDC is a gathering of people who build software applications for Apple hardware devices — from the iPhone and the iPad to the Mac — and with its new language, dubbed Swift, Apple is apparently providing a much faster and more effective means of doing so, significantly improving on its current language of choice, Objective-C,” Metz reports. “With something that Apple calls an ‘interactive playground,’ Swift is even exploring a highly visual kind of programming that may go beyond other mainstream languages. All those developers went nuts not only because they love Apple, but because the new language could make their lives that much easier.”

“Swift seems to extend Apple’s split with the rest of the software development universe. Many coders would prefer that Apple shift toward tools that would also let them build software for other machines from other vendors. But Tim Cook and company are traveling in the opposite direction. ‘Swift has all the right check boxes, but do we really need something that’s proprietary to Apple’s platform?’ asks programming guru David Pollak. ‘Yes, it solves a lot of problems, but it’s yet another way to drive a wedge between iOS development and everything else,’” Metz reports. “This is unlikely to harm Apple any time soon. In fact, the company prefers things this way. It insists on defining its own rules, and its devices are so widely used, it knows that large numbers of developers will happily build apps for them no matter what language this requires, driven by the enormous dollar signs they see in names like the iPhone and the Mac.”

Much more in the full article here.

MacDailyNews Take: The more time developers spend in Apple’s superior and rewarding environment, the less time they have to waste slumming it with inferior operating systems.

Android et al. are destined to become even more afterthought backwaters than they are today.

Related articles:
Apple just delivered a knockout blow to Android with iOS 8 – June 2, 2014
Xcode 6 features resizable device simulators, paving way for iPhones with new screen sizes – June 2, 2014
WWDC 2014: Apple sets the scene for its next decade – June 2, 2014
Apple unveils new versions of OS X and iOS, major iCloud update with iCloud Drive – June 2, 2014
Apple’s WWDC news bores investors, not developers – June 2, 2014
Apple’s HealthKit aims to unite wearables and fitness apps – June 2, 2014
Apple releases iOS 8 SDK with over 4,000 new APIs – June 2, 2014
Apple unveils iOS 8, the biggest release since the launch of the App Store – June 2, 2014
Apple announces OS X Yosemite for Macintosh – June 2, 2014

35 Comments

  1. For people who like to capitalize on what other people do (the investor community), Apple just announced more software updates.
    For people who like to make things happen, this is huge, the biggest change in 20 years.

    1. Seems like Apple is committing further to the walled garden, though in a world of extreme security risks, this may be the only long-term strategy to keep the average developer’s work safer than it is today.

      I’m not a current coder, but I can guess Apple has analyzed this choice to death and sees it as necessary for moving safely into a world of complex interconnected devices.

      Am I wrong?

      1. Theoretically, the Swift programming language is going to profoundly improve security specifically because it will prevent all buffer overruns, the prime cause of security holes in software.

        Today, safe memory management is a voodoo art known to few. Most software these days has crap for memory management, forcing a plethora of thumbs to keep the dike from leaking or falling down. It’s damned ugly. Programming without that worry would be a PROFOUND milestone. Hopefully Swift is it.

        (Recall that Java promised to have the same thing, but oops, it didn’t. Java is now the single most dangerous software on the Internet).

        1. No language or compiler is going to change the need to understand memory management. It’s the same thing with writing fast code. Sure, the compiler can optimize for you, but the guy who takes the time to understand the lower levels is going to know how to make the fastest code. That won’t change.

          Ada, C#, Haskell, Java, JavaScript, Lisp, PHP, Python, Ruby, and Visual Basic all have bounds checking and most of them have garbage collection vs. explicit reference counting.

          You can still hose things up and create security flaws, however. It just cuts down on the most common buffer flaws you might create.

          Good stuff regardless.
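          To make the bounds-checking point concrete, here is a minimal sketch in modern Swift syntax (the identifiers are made up for illustration). Subscripting past the end of an array traps with a deterministic runtime error instead of silently reading adjacent memory the way C can, and optional-returning accessors let you handle the miss explicitly:

```swift
let scores = [72, 88, 95]

// Direct subscripting is bounds-checked: scores[10] would trap with a
// deterministic runtime error rather than read past the buffer as C might.

// One safe pattern: validate the index before subscripting.
func score(at index: Int, in values: [Int]) -> Int? {
    return values.indices.contains(index) ? values[index] : nil
}

// Built-in optional accessors cover the common edge cases.
let firstScore = scores.first               // Optional(72)
let outOfRange = score(at: 10, in: scores)  // nil rather than a crash
```

          This is the trade-off the thread describes: the check does not eliminate logic errors, but an out-of-bounds access becomes a crash or a nil, never a silent read of someone else’s memory.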

            1. Oh brother don’t even get me started on priorities in software development. I could rant for days 🙂

              When I started programming in the 80s there was an actual focus on the profiling stage. It was considered a core part of the development process to profile your code and optimize it.

              At some point in the 90s the attitude became “Well it won’t be in use long” or “We can patch it later – ship it now!”

              We lost something when those attitudes took root.

  2. The devs are going nuts because they finally get modern programming language goodness like generics, type inference, multiple return values, and automatic memory management (ARC).

    Xcode is a pretty darn good IDE, but Objective-C was starting to feel long in the tooth.

    I shouldn’t be in Xcode having moments where I think “This would be easier using C# on a Microsoft platform!”
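    A quick sketch of the features mentioned above, in modern Swift syntax (all identifiers here are made up for illustration): type inference removes the declared types, a generic function works over any comparable element type, and a tuple serves as a multiple return value.

```swift
// Type inference: no type annotations needed.
let greeting = "Hello, Swift"   // inferred as String
let answer = 42                 // inferred as Int

// Generics: one function works for any Comparable element type.
func largest<T: Comparable>(_ values: [T]) -> T? {
    return values.max()
}

// Tuples give lightweight multiple return values.
func minMax(_ values: [Int]) -> (min: Int, max: Int)? {
    guard let lo = values.min(), let hi = values.max() else { return nil }
    return (lo, hi)
}
```

    All three are features C# and Java programmers have had for years, which is part of why Objective-C felt dated by comparison.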

    1. I really do not understand what is so bloody difficult about Objective-C and what people are bitching about. I find it quite elegant and easy to use. Hmmm.

      1. It’s not that it’s difficult or that there is anything, per se, “wrong” with it.

        It’s more a matter of using different languages and frameworks out there and then coming back to Objective-C and having it feel a bit out of date.

        Nothing fundamentally wrong with Objective-C at all. Swift just takes things in the right direction imho.

      2. I don’t find it difficult per se, but cumbersome for sure. And Objective-C (although amazing when it came out) has been left behind by the features available in C# and Java. Also, Apple’s APIs seem to require an immense amount of coding to do simple things when compared to C#/Java. Swift’s focus on code reduction, and on adding modern features (generics, type inference, etc.), will give Apple the potential to reach parity with C#/Java at some point in the future (hopefully). And I agree with Really, despite my Apple addiction. Regardless of the years I’ve spent with Objective-C, I’d still rather do it in C#.

  3. I’m almost halfway through reading the Swift language description provided by Apple. From that I can already say this is great; it will be the default development language for iOS and OS X, alongside plain C. Politely, Objective-C is being shown the door.

  4. Eight or ten years ago, after reading about Apple’s use of object oriented tools and advanced programming technologies, and also its unique strategic decision to provide a completely integrated product, and looking at the mess the other “open systems” approach was creating, I thought that Apple would be a good stock to buy. A few years later, after the stock had done well, I thought Apple’s strategy was even more obviously the wave of the future and it occurred to me that Apple would one day face the “monopoly” charge that Microsoft had faced years earlier. Microsoft had truly practiced the destructive actions of a bad monopolist. On the other hand, Apple was building a monopoly on virtuous products that people simply wanted more than anyone else’s. Apple has not purchased competitors and put their products out of existence. It has just continually made better mousetraps. Now, its vertical integration approach, and its incredible drive to hardware and software perfection, have made it an unbeatable force. The market always produces surprises, but Apple has so much competitive advantage in its integrated product, its superior hardware architecture, its superior software tools, its unique interaction among all its devices, its growing ownership of core technologies (Liquidmetal, the pearl stuff), its economies of scale, and its totally loyal customer base. All these things tell me Apple will be an effective monopoly within about 2-3 years.

    1. Monopolies are not bad in and of themselves. Monopoly “abuse” is the bad part. Microsoft abused their monopolistic position over hardware vendors while delivering a so-so software experience for the world. Apple is not like that at all, as illustrated by yet another advance forward in hardware and software perfection, exactly as you stated. They are not driven primarily by market share as Microsoft is.

  5. Oh, I can see a certain company soon releasing a brand new copy, uh, a brand new programming language called Swill. No doubt they will be brought to court over it and rationalize that Apple has no copyright over the letter “a”, nor any of the other letters for that matter.

      1. Thanks for the heads up. I hope I’ll be all right, it’s been a while since I’ve had to deal with any malefic troll, the last one being Zune Thang who used to squirt points all over the place. Phew that was a tough one. But he’s fine now, although he still rolls around in the grave I hear.

  6. I love Apple and computer programming, but have never been fond of Objective C. Let’s just say it’s not as fun or intuitive to use as many other programming languages I’ve learned. From the first couple chapters of the Swift book, this language looks like exactly what I wish Objective C was from the beginning.

    One simple (kinda shallow) example: every single line of Objective C must end in a semicolon – which annoys me, and leads to frequent “missing semicolon” errors. Swift, like nearly every programming language invented in the last 20 years, does away with this useless syntax. You end a line of code in Swift by hitting enter and starting the next one, as it should be.
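    For illustration, here is the same pair of statements in both languages (a trivial sketch; the Objective-C lines are shown as comments so the block stays in one language). Swift treats the line break as the statement terminator; semicolons remain legal but are only needed to join statements on one line:

```swift
// Objective-C equivalent, shown as comments for comparison:
//   NSString *name = @"Swift";
//   NSLog(@"Hello, %@", name);

// Swift: the line break terminates each statement.
let name = "Swift"
print("Hello, \(name)")

// Semicolons remain legal, and are needed only to put
// two statements on one line:
let a = 1; let b = 2
```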

    1. “One simple (kinda shallow) example: every single line of Objective C must end in a semicolon – which annoys me, and leads to frequent “missing semicolon” errors. ”

      Uh… that would be indicative of your own lack of attention to detail. PHP, with which Swift shares a very similar syntax, also uses semicolons at the end of lines. As a PHP programmer, I don’t struggle with that problem at all. It’s just part of my discipline. But I agree with the sentiment. I just find an end-of-line terminator to be a clear indicator of completion.

      1. It’s not a struggle – more of an eyesore and minor unnecessary annoyance than anything. When I use a language that requires semicolons, I remember to use them. I only run into missing semicolons when I’m debugging other people’s code – it’s a common beginner’s error. Where are you conjuring “my lack of attention to detail” from?

        As a PHP programmer, I wouldn’t assume your main struggle would be remembering semicolons – I’d assume your struggle would be spaghetti-coded webpages and SQL injection attacks.
