Recent Advances in Differentiable Swift

The Swift programming language is a key tool in the construction of PassiveLogic's digital twins and autonomous control systems. We chose it in large part because it has a capability unique among systems languages: language-integrated automatic differentiation. For an introduction to differentiable Swift and why it is an incredibly powerful tool for optimization, I highly recommend reading my coworker Porter Child's five-part series.
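
To give a flavor of what language-integrated automatic differentiation looks like, here is a minimal sketch. It assumes a recent experimental toolchain with differentiable Swift enabled, and the exact spellings (for example, `@differentiable(reverse)` and `gradient(at:of:)`) have shifted between toolchain versions:

```swift
// Minimal sketch of differentiable Swift (assumes an experimental toolchain
// with the feature enabled; spellings may vary between toolchain versions).
import _Differentiation

// Marking a function @differentiable(reverse) asks the compiler to generate
// a reverse-mode derivative alongside the original function.
@differentiable(reverse)
func square(_ x: Double) -> Double {
    x * x
}

// gradient(at:of:) evaluates the derivative of `square` at x = 3.
// d/dx (x * x) = 2x, so this prints 6.0.
let derivative = gradient(at: 3.0, of: square)
print(derivative) // 6.0
```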

Differentiable Swift is an experimental language feature that is currently being pitched as part of the Swift Evolution process in “Differentiable programming for gradient-based machine learning”. The original design grew out of the Swift for TensorFlow project at Google, and is now being maintained and advanced by the community. I’m a former member of the Swift for TensorFlow team, and am proud to help carry on that work here at PassiveLogic.

As described by Porter in his aforementioned series of articles, differentiable Swift enables us at PassiveLogic to perform gradient descent through strongly-typed physics models and more. Because of the importance of this language feature to our products, we’ve invested in continuing to move it forward.
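
As a rough illustration of that workflow, here is a hedged sketch of gradient descent through a small, strongly-typed model. The `ThermalModel` type, its parameter, and the loss function are hypothetical stand-ins rather than PassiveLogic's actual simulation code, and details such as the synthesized `TangentVector` members may differ across toolchain versions:

```swift
import _Differentiation

// Hypothetical toy model: maps a temperature difference to a heat-flow
// prediction through a single learnable parameter.
struct ThermalModel: Differentiable {
    var conductance: Double = 0.1

    @differentiable(reverse)
    func callAsFunction(_ deltaT: Double) -> Double {
        conductance * deltaT
    }
}

// Squared-error loss between the model's prediction and a measured value.
@differentiable(reverse)
func loss(_ model: ThermalModel, input: Double, target: Double) -> Double {
    let error = model(input) - target
    return error * error
}

var model = ThermalModel()
let learningRate = 0.001

// Plain gradient descent: differentiate the loss with respect to the model's
// parameters and step each one against its gradient.
for _ in 0..<100 {
    let grad = gradient(at: model) { (m: ThermalModel) -> Double in
        loss(m, input: 10.0, target: 2.5)
    }
    model.conductance -= learningRate * grad.conductance
}

print(model.conductance) // approaches 0.25, since 0.25 * 10.0 == 2.5
```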

Over the past year, significant work has gone into identifying, reproducing, and fixing bugs in differentiable Swift that stood in the way of specific applications. As a result of that effort, we at PassiveLogic are now able to deploy all of our Swift-based simulation and control software into production. This is an important milestone on our path toward deploying truly autonomous control systems for buildings.

These patches have been a community effort. Richard Wei, one of the primary authors of differentiable Swift, has continued to drive the design and stabilize the interface. I'd personally like to thank Dan Zheng, another of the primary authors, for his ongoing advice and reviews. At PassiveLogic, we've partnered with Access Softek, Inc., whose engineer Anton Korobeynikov has done an excellent job of upstreaming patches for many of the issues we've encountered. I'd also like to make special mention of Philip Turner's work isolating reproducers and upstreaming tests and fixes for issues he observed while updating the Swift for TensorFlow APIs for current Swift toolchains.

In an effort to keep the open source Swift community up to date on new developments in differentiable Swift, I've started a thread in the Swift forums. I've kicked it off with a list of many of the patches that have gone into the Swift compiler for this feature over the last year. If you're interested in differentiable Swift, I highly recommend browsing through that long list to see what's changed. As the work continues, I'll do my best to post new patches in that thread.

Looking forward, now that differentiable Swift is working well for us in production, we're going to shift our efforts into making it as fast as it can be. Kevin Vigor hinted at this in his article about using Enzyme autodiff with Swift, but we believe we have a path toward improving differentiable Swift's host-side performance by two orders of magnitude for many common cases over the next year.

If working within the Swift compiler sounds exciting to you, or you want to try your hand at differentiable programming in practice, we’re hiring!
