Top 10 WWDC 2017 Videos
Wondering which WWDC 2017 videos are the best for developers to watch? Check out our recommended Top 10! By Tim Mitra.
Contents
- 1) Platforms State of the Union – Session 102
- 2) What’s New in Swift – Session 402
- 3) What’s New in Cocoa Touch – Session 201
- 4) Core ML In Depth – Session 710
- 5) Advanced Animations with UIKit – Session 230
- 6) What’s New in CareKit and ResearchKit – Session 232
- 7) Designing Sound – Session 803
- 8) Vision Framework: Building on Core ML – Session 506
- 9) Debugging with Xcode 9 – Session 404
- 10) Updating Your App for iOS 11 – Session 204
- Where to Go From Here?
If you weren’t lucky enough to get a “golden ticket” to WWDC 2017, catching up by watching the videos will be quite a challenge, as there are over 130 WWDC session videos available this year!
There are videos on the newest APIs, such as ARKit, CoreML, Drag and Drop and Vision; ones covering Xcode 9 with new and improved refactoring in Swift, Objective-C and C++, and then there’s everything new in Swift 4, new bits in UIKit, great videos on accessibility and so much more.
What’s a developer to do?
Fear not, as the raywenderlich.com tutorial team and friends have assembled a list of the Top 10 WWDC 2017 videos that cover everything you need to know, in a minimum of time. We consider these “must-see” sessions for developers from all backgrounds and specialties!
Pro tip: you can watch the videos faster than real time. With the video element selected in your browser's Web Inspector, run the following line in the JavaScript console:
$0.playbackRate = 1.4;
You can thank me later! :]
1) Platforms State of the Union – Session 102
If you only have time for one video, this is it!
For developers, the real start of WWDC is the Platforms State of the Union session. The Keynote is a fluffy offering to surprise and delight the general public, investors and the Apple faithful. The State of the Union, in contrast, is where the really interesting details come out.
This talk surveys the new technologies and outlines which sessions will provide more details on each technology. Here are the highlights of the 2017 Platforms State of the Union:
- Xcode 9 has a brand new source editor rewritten in Swift. The new source editor is more semantic, has tokenized editing from Swift playgrounds as well as an integrated Markdown editor. The best feature? A brand-new refactoring system that supports Swift, Objective-C, C and C++.
- Swift 4 continues to make programming easier, faster, safer and more modern. String handling is vastly improved, with an easier API and deeper Unicode integration. The Codable protocol handles encoding and decoding, which makes it possible to handle JSON natively in Swift. You can mix Swift 3.2 and Swift 4 targets in the same project, and compiling Swift alongside Objective-C is now faster than before.
- CoreML makes adopting machine learning in your apps nearly a plug-and-play exercise. Choose a third-party model library, add it to your app and boom, you have a neural network. Add a few lines of code and you can create your own ballpark frank identifier.
- ARKit lets you create Augmented Reality apps on iOS. Using visual-inertial odometry, the framework fuses camera data with the accelerometer and gyroscope to track objects in the real world. With Scene Understanding, you can place objects into your AR scene and light them appropriately for increased realism. You can use graphics frameworks like Metal 2 in ARKit and even use pre-existing plugins from Unity and Unreal.
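To give a sense of how little code it takes to get started, here is a minimal, hypothetical ARKit setup sketch, assuming an ARSCNView named sceneView is already in your view hierarchy:
import ARKit

// Run world tracking on an existing ARSCNView (iOS 11, A9 or later devices).
let configuration = ARWorldTrackingConfiguration()
configuration.planeDetection = .horizontal   // detect horizontal surfaces for placing content
sceneView.session.run(configuration)
// SceneKit nodes added to sceneView.scene now appear anchored in the real world.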
There are many more new items covered in the Platform State of the Union than I can address in this article. If you watch no other WWDC 2017 session video, this is definitely the one you want.
2) What’s New in Swift – Session 402
“The What’s New in Swift session is so dense this year you can cut it with a knife!” – Me, just now.
The session begins with a shout-out to Ole Begemann’s open source playground, which let you try Swift 4 before WWDC. Because Swift is open source, you can grab a toolchain snapshot from swift.org and use it in Xcode. Xcode 9 also brings refactoring, and the open toolchain mechanism even lets you experiment with rolling your own refactorings.
This session is so dense that we can only cover some of the highlights:
- The private keyword has been redefined to reach across multiple extensions of a type in the same source file, while still protecting other elements in that file.
- Swift 4 lets you compose a class with any number of protocols, just as you could in Objective-C. This allows you to code in a more Swift-y style with extensions.
- Source compatibility between Swift 4 and the new Swift 3.2 also has many refinements. As a result, the Swift 4 compiler has a compilation mode that emulates Swift 3. Since it understands the differences between the two, you can update your target to Swift 4 and leave your dependencies in Swift 3. Migrating to Swift 4 is no longer the end of the world and can be done as your development schedule permits.
- Xcode 9 has a new more efficient build system, based on llbuild, that you can opt in to and try out.
- Precompiled Bridging Headers mean that a single header is compiled and shared by every Swift file in your project. Previously, it had to be recompiled for every Swift file.
- Code coverage has improved: you no longer have to compile the entire project to run a test. Now the project is built once and saved.
- Copy-on-write existential buffers are only used when values are mutated; otherwise, the values are read-only. Previously, large values were moved to the more expensive heap, creating a huge performance hit.
- The Swift 4 compiler now can automatically strip away conformances that aren’t used.
- Strings are now Collections of Characters again, which means you can use collection methods like .split directly on them. Splitting, however, produces slices that share storage with the original string; to avoid accidentally keeping a large string alive, these slices get their own Substring type, with the same behaviors as String.
- Multi-line string literals are a new feature proposed and maintained by the open source community. To use them, simply enclose your multi-line text in triple quotes. Both features appear in the short sketch after this list.
- Exclusive Access to Memory makes it easier to reason about local variables and enables both programmer and compiler optimizations, since properties sometimes need to be protected during operations. While it may be fine for a variable to be read by two separate processes, writing to it should be an exclusive operation. With this new rule in Swift 4, the compiler will tell you when exclusivity is violated on a single thread, and the new Thread Sanitizer tool will tell you when it happens across multiple threads.
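Here’s a quick taste of the string changes from the list above, as a minimal Swift 4 sketch (the example strings are mine):
// Multi-line string literals: just wrap the text in triple quotes.
let welcome = """
    Welcome to WWDC 2017.
    Grab a seat and a coffee.
    """

// Strings are Collections again, so collection APIs like split work directly.
let languages = "Swift,Objective-C,C++"
let pieces = languages.split(separator: ",")   // [Substring], sharing storage with languages
let first = String(pieces[0])                  // convert to String before storing long-term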
3) What’s New in Cocoa Touch – Session 201
Presented by Eliza Block and Josh Shaffer, What’s New in Cocoa Touch is a rapid-fire overview of new productivity, UI refinements and new APIs. Like the Platforms State of the Union, this session leads the way into other, in-depth sessions. Eliza gives a brief overview of adding drag functionality, moving items around the screen and finally dropping, where your app receives the dropped data. Drag and Drop is relatively easy to deploy, as many existing frameworks have the hooks in place already to handle this.
“What’s new in Cocoa Touch is a great overview of a lot of the changes on iOS” – Ellen Shapiro
Document management is also further refined by file management across Cocoa Touch. Based on UIDocumentBrowserViewController, files can now be accessed independently and stored in folders. In fact, files from one app may even be accessed from other apps. There is an understated push to make the iPad and larger iPhones more flexible through these and other refinements.
Josh Shaffer covers the new dynamic large titles as part of the UI refinements. The large, prominent title, reminiscent of the News app, lives inside a larger header bar. As you scroll down the page, the header shrinks to the familiar style and size. The Safe Area creates a buffer space around the edge of devices. This clears the edges for gestures, creates a cleaner look and, most importantly, accounts for overscan, which matters on tvOS devices. And even better, UIScrollView no longer fights with your contentInsets! The Design Shorts 2 session and Updating Your App For iOS 11 have more info on these.
Eliza returns to cover a few new things in Swift 4, such as the new KeyPath type with its “\” literal, which makes the new block-based KVO APIs easier to use and clearer to read. She also covers the Codable protocol, which enables objects to be archived and unarchived and brings native JSON encoding to Cocoa Touch apps.
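As a rough illustration of both features, here’s a small sketch (the Player and Profile types are made up for the example, not taken from the session):
import Foundation

class Player: NSObject {
    @objc dynamic var score = 0      // must be @objc dynamic for KVO
}

struct Profile: Codable {
    let name: String
    let level: Int
}

let player = Player()

// Block-based KVO using the new \ key path literal.
let observation = player.observe(\.score, options: [.new]) { _, change in
    print("Score is now \(change.newValue ?? 0)")
}
player.score = 10

// Codable gives you JSON encoding and decoding essentially for free.
let data = try! JSONEncoder().encode(Profile(name: "Ray", level: 3))
let decoded = try! JSONDecoder().decode(Profile.self, from: data)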
Working with Dynamic Type, which is essential for accessibility, is now easier with the new UIFontMetrics objects. Auto Layout works with Dynamic Type to help the system size your fonts. Password autofill is also covered in brief.
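A minimal sketch of the UIFontMetrics approach (the custom font name is just an example):
import UIKit

// Scale a custom font according to the user's Dynamic Type setting (iOS 11).
let customFont = UIFont(name: "AvenirNext-Regular", size: 17)!
let scaledFont = UIFontMetrics(forTextStyle: .body).scaledFont(for: customFont)

let label = UILabel()
label.font = scaledFont
label.adjustsFontForContentSizeCategory = true   // update automatically when the user changes the setting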
This session gives you enough information to speak clearly about new features in Asset Catalogs, PDF-backed images, and ProMotion’s support for higher refresh rates on the latest devices.
4) Core ML In Depth – Session 710
Machine Learning is clearly a hot topic these days and Apple has made it easy to add this technology to your apps.
With Core ML, you can consider machine learning as simply calling a library from code. You only need to drop a Core ML model into your project and let Xcode sort everything else out. In this session, Krishna Sridhar and Zach Nation give overviews of the types of use cases for machine learning in your apps.
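In practice, dragging a .mlmodel file into Xcode generates a typed Swift class for it. Here’s a hypothetical sketch of what calling one looks like; the FoodClassifier name and its input/output properties are stand-ins, since the generated interface depends on the model you drop in:
import CoreML
import CoreVideo

// FoodClassifier is a hypothetical class Xcode would generate from FoodClassifier.mlmodel.
func classify(_ pixelBuffer: CVPixelBuffer) {
    do {
        let model = FoodClassifier()
        let output = try model.prediction(image: pixelBuffer)   // typed prediction call
        print("Best guess: \(output.classLabel)")
    } catch {
        print("Prediction failed: \(error)")
    }
}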
“It was good, lots of attention of augmented reality and machine learning. Especially when Apple made machine learning plug and play. All you need is to find a model you can use or worry about training your own. Everything else just works! “ – Vincent Ngo
You can put Core ML to work with handwriting recognition, credit card analysis, sentiment analysis with text input and gesture recognizers. Krishna demonstrates how Core ML has a natural way of dealing with numeric and categorical input. You can also use Core ML with Natural Language Processing (NLP) to determine the mood of the user by processing text.
The speakers cover the hardware optimization of Core ML along with Core ML Tools that let you convert and work with popular machine learning formats. You also don’t need to sort out whether your project will use the CPU or GPU. Core ML takes care of that for you.
Zach Nation demonstrates how to use Apple’s open-sourced Core ML Tools, which are a set of Python scripts to import common machine learning formats and convert them to Core ML library format.
“CoreML and related are fantastic – great technology and easy to use.” – Mark Rubin
I’m also awarding an honorable mention to Introducing Core ML [https://developer.apple.com/videos/play/wwdc2017/703/], which also ranked well. It’s further proof that Core ML seems to be the runaway topic of WWDC 2017!
5) Advanced Animations with UIKit – Session 230
Joe Cerra takes you through some basics of UIKit animations, with the aim of helping you make your animations interactive and interruptible. In 2016, Apple introduced UIViewPropertyAnimator, which enables you to do just that. With this API, you can give your animations customized timing as well as update them on the fly. Joe walks through how to adjust timings to create more interesting effects.
Joe demonstrates several enhancements to a simple demo animation with a pan gesture recognizer, including pauseAnimation() to pause and continueAnimation() to resume moving an object in the view. Midway through the talk, he demonstrates how to combine a number of tap and pan gestures along with animation properties to create an interactive and interruptible experience. Building on the effect, he adds in new behaviors, linear and nonlinear scrubs, pausing, and uses springs to add realism and damping.
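Here’s a minimal sketch of the pause-and-scrub pattern described above, assuming a view controller with a box subview and a pan gesture recognizer wired to handlePan (the names are mine, not from the session):
import UIKit

class BoxViewController: UIViewController {
    let box = UIView()
    var animator: UIViewPropertyAnimator?

    @objc func handlePan(_ pan: UIPanGestureRecognizer) {
        switch pan.state {
        case .began:
            // Create the spring animation and immediately pause it so the finger can drive it.
            animator = UIViewPropertyAnimator(duration: 1.0, dampingRatio: 0.7) {
                self.box.frame.origin.y += 300
            }
            animator?.pauseAnimation()
        case .changed:
            // Scrub the paused animation with the pan translation.
            animator?.fractionComplete = pan.translation(in: view).y / 300
        case .ended, .cancelled:
            // Hand control back to the timing curve to finish from wherever we stopped.
            animator?.continueAnimation(withTimingParameters: nil, durationFactor: 0)
        default:
            break
        }
    }
}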
Using UIVisualEffectView, Joe combines blur and zoom to create compelling effects that he terms “view morphing”. The final reveal involves new properties for corner radii and masked corners. There are plenty of great tips and tricks covered in the session – way more than I can fit here in a few paragraphs.
6) What’s New in CareKit and ResearchKit – Session 232
Samantha Mravca refreshes viewers on ResearchKit and CareKit, as well as how they both sit on top of Apple’s HealthKit framework. ResearchKit allows institutions to build tools for gathering medical information and sharing that data with other HealthKit apps. CareKit, introduced in 2016, enables users to play an active role in their health.
The session covers some new features and the CareKit prototyping tool. There are some really interesting widgets and controls in CareKit to display progress, collect stats and capture optional and read-only data. I wonder if these widgets could find a place in other types of apps.
The speakers cover some existing data collection, or “active task”, examples such as hearing tests, Stroop focus tests and cognitive tests like trail making for visual attention. New modules include range of motion tests that make use of the accelerometer and gyroscope to test the motion of shoulders and knees.
CareKit now combines the user’s health data and symptoms into Care Contents. CareKit also includes some ready-to-use glyphs for iOS and watchOS. New this year are threshold measurements including numeric and adherence thresholds.
The session also covers the CareKit prototyping tool, which is aimed at non-technical users who want to build CareKit apps quickly. Ultimately, these tools are designed for health professionals and involve a minimal amount of coding, and in some cases none at all. Health care is a fascinating subject that we all have a vested interest in.
7) Designing Sound – Session 803
Apple sound designer Hugo Verweij invites attendees to close their eyes as he takes the viewers on an aural journey through a forest, then into light rain and finally a thunderstorm. Sound, he says, has a magical ability to create emotions. This session takes the audience through various soundscapes and demonstrates that sound is an integral part of our experiences with our apps and devices.
Sound can warn us; sound can convey a person’s calm or haste. App design doesn’t end with how the app looks. Using sound in apps helps shape the experience the developer is trying to convey. Sounds attached to notifications can indicate “look at me!”, “time to wake up”, or “oops, your Apple Pay transaction failed”.
He demonstrates how sound can be used in the Toast Modern app, which is a timely demonstration as hipster toast sweeps through the Bay Area. Hugo continues with a special set where he shows how some of the popular and familiar sounds in iOS were created. Sorry, I won’t give any spoilers here — you’ll have to watch it yourself! :]
Haptics combine with sound to provide a rich experience to what we see, hear and feel on our Apple Watches and iPhone 7s. The session also covers how different tones in your sound design can create different feelings.
This session is for more than just musicians and sound designers; it’s a must-see even if you’ve never thought about sound in your app before. If you do nothing about sound, you’ll be stuck with the default sounds, and you’ll miss the opportunity to make your app stand out and be in line with your branding.
Hugo also reminds us that silence is golden. Use sound sparingly, and offer to turn off sounds altogether. Whatever you do, ask yourself, “What do I want people to feel when they use my app?”
8) Vision Framework: Building on Core ML – Session 506
In this session, Brett Keating describes what you can do with the Vision framework. Face detection with deep learning, optionally combined with Core ML, promises some interesting enhancements. There’s better detection and higher recall, which enables you to recognize smaller faces, strong profiles and even obstructed faces.
Image registration will let you stitch together separate images by using common landmarks, and features like rectangle detection and object tracking are now more refined.
Combined with Core ML, you won’t have to do any heavy lifting to put the Vision framework to work in your app. The framework will tell you where the faces are, and Apple will take care of the rest. As Apple describes it, Vision is a high-level, on-device solution to computer vision problems through one simple API.
He also discusses the benefits of on-device image processing versus cloud-based processing. By keeping processing on the device, you retain the privacy of your users’ data. Cost is another factor, since cloud-based services can be expensive for both the developer and the user. The low latency of on-device processing is also an advantage.
Frank Doepke then takes over the talk and delves into some practical demos of the Vision framework. He explains that it’s a matter of making requests, handling them and viewing the results. You can use basic settings, and feed in a single image or a series of images. You can also use Core Image if that’s how you roll. Dropping in a Core ML model lets you further refine the tasks your app performs, such as object recognition. In the last demo, he uses a model trained on MNIST, the handwritten-digit dataset popular in the machine learning community. With this, he’s able to categorize, straighten and recognize handwritten characters.
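The request-and-handler flow Frank describes boils down to just a few lines. Here’s a minimal face detection sketch (face rectangles only, no Core ML model involved):
import UIKit
import Vision

// Find face bounding boxes in an image using the Vision framework (iOS 11).
func detectFaces(in image: UIImage, completion: @escaping ([VNFaceObservation]) -> Void) {
    guard let cgImage = image.cgImage else { return completion([]) }

    let request = VNDetectFaceRectanglesRequest { request, _ in
        completion(request.results as? [VNFaceObservation] ?? [])
    }

    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    DispatchQueue.global(qos: .userInitiated).async {
        try? handler.perform([request])   // results arrive in the request's completion handler
    }
}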
This is a great session if you’re interested in computer vision. Throw in Core ML and Core Image and you can create the next great hotdog-detecting app.
9) Debugging with Xcode 9 – Session 404
“Debugging is what we developers do when we’re not writing bugs.” – Me, again.
I spend an awful lot of time setting breakpoints, looking at debug logs, and playing in the View Debugger and the Memory Graph Debugger, so any session on debugging is a favorite of mine.
Wireless Development is the first topic covered in this session. The Lightning cable is no longer required — yay! Working untethered definitely aids in ARKit and tvOS development, managing other accessories plugged into the Lightning port or even when you’re just kicking back on the couch. Connecting is straightforward on basic networks, Apple TV and corporate networks. This new capability is demoed working wirelessly with the accelerometer.
Time Profiler now has an “All” button that lets you view all the active threads in your app. You can also pin one thread and compare as you scroll through the other threads.
Breakpoints with conditions are now easier to work with, and code completion is now included in the breakpoint editor. Additionally, breakpoints with options now have a white triangle indicator for easy recognition, and a tooltip shows which options are set.
In the View Debugger, view controllers are now included in the view tree as the parents of their views. They are also indicated in-canvas with a banner, making them easy to find, and they can be selected and reviewed in the inspector.
The View Debugger lets you inspect SpriteKit views so you can debug sprites and views. Apple has included the SceneKit Inspector to edit your scene and debug it in runtime debugging mode. The entire scene graph can be explored and additionally saved as a snapshot.
“We use our debugger to debug our debuggers.” – Chris Miles.
The Memory Graph Debugger is actually built with SpriteKit. In the demo, the presenter opens Xcode itself and debugs it in another copy of Xcode. Finally, Sebastian Fischer demos the new enhancements to debugging in Xcode 9.
10) Updating Your App for iOS 11 – Session 204
You’ll definitely want to watch this session, unless you’re one of those mythical developers who has never wrestled with your app layout.
Up first — UIKit. Navigation bars and tab bars, along with their UIBarItem contents, now adopt a new landscape appearance that is slightly smaller, with the title and icon side by side. Turning on Large Titles in the navigation bar is as easy as setting a property and adopting largeTitleDisplayMode. A UISearchController can now be attached to the navigation item, which places the new style of search bar in the header, where it can scroll away and hide.
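Adopting the new navigation bar behavior takes only a few lines; here's a quick sketch inside a view controller's viewDidLoad (passing nil for the search results controller purely for brevity):
import UIKit

class FeedViewController: UITableViewController {
    override func viewDidLoad() {
        super.viewDidLoad()

        // Large, News-style title that collapses to the familiar bar as you scroll.
        navigationController?.navigationBar.prefersLargeTitles = true
        navigationItem.largeTitleDisplayMode = .automatic

        // Put the search bar under the title, where it scrolls away to hide (iOS 11).
        navigationItem.searchController = UISearchController(searchResultsController: nil)
        navigationItem.hidesSearchBarWhenScrolling = true
    }
}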
Navigation bars and tab bars now support Auto Layout. These items provide their positions, and you provide the sizes. The former layout margins are now actually minimums, and topLayoutGuide and bottomLayoutGuide are deprecated in favor of the new safe area. Layout margins also gain a directional variant that applies to leading and trailing constraints. You can also decide to override the properties altogether and have full-screen content.
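A short sketch of pinning content to the new safe area instead of the old layout guides (inside a view controller's viewDidLoad):
// Constrain a content view to the safe area rather than topLayoutGuide/bottomLayoutGuide.
let content = UIView()
content.translatesAutoresizingMaskIntoConstraints = false
view.addSubview(content)

NSLayoutConstraint.activate([
    content.topAnchor.constraint(equalTo: view.safeAreaLayoutGuide.topAnchor),
    content.leadingAnchor.constraint(equalTo: view.safeAreaLayoutGuide.leadingAnchor),
    content.trailingAnchor.constraint(equalTo: view.safeAreaLayoutGuide.trailingAnchor),
    content.bottomAnchor.constraint(equalTo: view.safeAreaLayoutGuide.bottomAnchor)
])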
Table view headers and footers are now self-sizing in iOS 11. If you’re not ready for this, you can easily opt out by setting the estimated sizes to zero. Separator insets are now interpreted relative to the edges of the cells rather than the full width of the screen, and UITableViewCell and UITableViewHeaderFooterView content views now respect the safe area insets as well.
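If you want the pre-iOS 11 behavior back while you adapt, the opt-outs mentioned above look roughly like this (in viewDidLoad of a table view controller):
// Opt out of self-sizing headers and footers by zeroing the estimated heights.
tableView.estimatedSectionHeaderHeight = 0
tableView.estimatedSectionFooterHeight = 0

// Choose how separator insets are interpreted; relative to cell edges is the iOS 11 default.
tableView.separatorInsetReference = .fromCellEdges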
If you are eager to adopt the look and feel of iOS 11 in your apps, this should definitely be on your watchlist.
Where to Go From Here?
In summary, here are our picks of the top 10 WWDC videos to watch:
- Platforms State of the Union
- What’s New in Swift
- What’s New in Cocoa Touch
- Core ML in Depth
- Advanced Animations with UIKit
- What’s New in CareKit and ResearchKit
- Designing Sound
- Vision Framework: Building on Core ML
- Debugging with Xcode 9
- Updating Your App for iOS 11
Thanks to contributors: Ellen Shapiro, Kelvin Lau, Kevin Hirsch, Sam Davies, Kiva John, Caroline Begbie, Mark Rubin, Matthijs Hollemans, Vincent Ngo, and Jaime Lopez Jr!
What do you think are the “don’t miss” videos of WWDC 2017? Tell us in the comments below!