Everything You Ever Wanted to Know About Core Haptics

By Gernot Poetsch

Posted on November 1, 2019 in Technology Spotlight

The iPhone Taptic Engine and speaker module - both used by Core Haptics.

Ever since mobile phones have fit into our pockets, they have used haptics to alert us. Until recently, this meant a dull vibration, usually produced by a rotating motor with an off-center mass. These rumbling motors soon delivered more than just notifications of incoming calls: the Nintendo N64 had a Rumble Pak, and joysticks had Force Feedback. However, aside from these simple forms of tactile feedback, our eyes and ears were usually the only senses targeted by our devices. That changed when linear actuators replaced the rotating motor with what is essentially the core of a traditional loudspeaker, minus the audio: a comparatively heavy mass, held by springs and moved linearly by magnets and coils.

With it came new capabilities: analogous to audio, it’s possible to vary frequency and amplitude independently and to have the mass oscillate continuously or stop abruptly. A linear actuator can act as a silent wearable subwoofer for music, such as the Basslet from Lofelt, or create very subtle, supporting vibrations that feel like clicks, as in every modern Apple Trackpad. Recent iPhones feature numerous touch gestures that are supported by haptics. Game controllers like the Joy-Con of Nintendo’s Switch or the upcoming PlayStation 5 controller have also moved to linear actuators. Some modern cars pair the extensive touchscreens in their dashboards with haptic feedback so that you don’t need to take your eyes off the road.

The iPhone Taptic Engine is the most broadly deployed haptic platform by far, so it’s time to look at how to create content users can feel. With iOS 13, Apple finally introduced the Core Haptics framework that enables developers to create a fully custom experience that leverages everything the Taptic Engine has to offer.

This article is in two parts. The first part describes Core Haptics and the fundamentals of the technology. The second part takes a more in-depth look at how Core Haptics works and gives an overview of Lofelt Composer, which is the missing piece of the puzzle: an intuitive tool to design the complex patterns that make a good custom haptic experience.

So what is Core Haptics? And should I use it?


Apple introducing Core Haptics at WWDC 2019. The session video is available here.

If you are developing an iPhone app, you should not pass on the opportunity to enhance it with haptic feedback. If you only need the predefined, well-known clicks and alerts, or if you target iOS versions before 13, UIFeedbackGenerator might provide everything you need.
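For those simpler cases, a minimal sketch of the UIFeedbackGenerator route might look like the following (the class name and trigger points are made up for illustration):

```swift
import UIKit

// Hypothetical controller for a pull-to-refresh gesture, used here only to
// show where the predefined feedback generators typically fit in.
final class PullToRefreshHaptics {
    private let impact = UIImpactFeedbackGenerator(style: .medium)
    private let notification = UINotificationFeedbackGenerator()

    func userStartedDragging() {
        // Warm up the Taptic Engine to minimize latency for the upcoming tap.
        impact.prepare()
    }

    func userCrossedRefreshThreshold() {
        // A single, predefined "thud".
        impact.impactOccurred()
    }

    func refreshFinished(successfully success: Bool) {
        // One of the three predefined notification haptics.
        notification.notificationOccurred(success ? .success : .error)
    }
}
```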

However, if you are developing an immersive game, a reactive music app, or other content where you need direct control over what your user feels, Core Haptics is your best choice.

Core Haptics was released in iOS 13 and is supported on the iPhone 8 or newer. Sure, iPads don’t have a Taptic Engine, but the Watch does, and so do Apple Trackpads and who knows what else Apple has in mind. While Core Haptics is currently not supported beyond the iPhone, the framework is built to account for all this possible variability. To make it safe to use (after all, it’s a mechanical device with quite some power behind it), the framework does not operate on waveforms but introduces abstract concepts to represent its content. This abstraction makes it more adaptable to different hardware, and your custom haptics are more likely to leave your (and your users’) expensive new toys in one piece.

Everything is made of just two building blocks

A primary new concept of Core Haptics that makes it different from waveform-based audio is that its content is abstracted into Haptic Events. These are used internally to calculate the movement that the linear actuator then performs. Core Haptics has two main types of Haptic Events: transient and continuous.

A transient event is a click. It is brief, spiky, and distinctive. A single one stands out, but too many events too close together in a row can no longer be felt individually (or the hardware won’t play them back).

A continuous event is a vibration. Think of it like a motor, but more sophisticated.

We measured the continuous and transient Core Haptic events using an accelerometer attached to an iPhone 8. The results are very close to Apple’s event representation.

Both types of events have parameters, the main ones being sharpness and intensity, which can be modified in various ways that we’ll talk about later. If you want to get hands-on experience, download the Apple sample code for Haptic Palette and play around with it for a while.
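To get a feel for the two building blocks in code, here is a minimal sketch with arbitrarily chosen intensity and sharpness values:

```swift
import CoreHaptics

// A short, spiky "click".
let click = CHHapticEvent(
    eventType: .hapticTransient,
    parameters: [
        CHHapticEventParameter(parameterID: .hapticIntensity, value: 1.0),
        CHHapticEventParameter(parameterID: .hapticSharpness, value: 0.8)
    ],
    relativeTime: 0
)

// A softer vibration that starts shortly after the click and lasts half a second.
let buzz = CHHapticEvent(
    eventType: .hapticContinuous,
    parameters: [
        CHHapticEventParameter(parameterID: .hapticIntensity, value: 0.6),
        CHHapticEventParameter(parameterID: .hapticSharpness, value: 0.3)
    ],
    relativeTime: 0.1,
    duration: 0.5
)
```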

How Haptics are specified

Transient and continuous events are the only available building blocks to construct more complex patterns. Multiple events can be composed into a complex structure and be further refined by modifying parameters and parameter curves.

Events are defined with Core Haptics’ CHHapticEvent, which has three required properties:

  • The type, which is a CHHapticEvent.EventType: hapticContinuous or hapticTransient, as well as two audio types, audioContinuous and audioCustom (more on these later).
  • A relativeTime for the start.
  • A duration.

Also, there can be an array of optional parameters in eventParameters. Those parameters refer to a CHHapticEvent.ParameterID and, of course, have a value:

  • hapticIntensity refers to the strength of the event. It translates to the amplitude of the haptic waveform. Analogous to audio, the scale for this parameter is logarithmic, not linear.
  • hapticSharpness has a different effect based on the type of event. In transient events, it could be described as the “clickiness” of the feedback, and for continuous events, this parameter controls the “speed” of the vibration (i.e., frequency of the vibration).
  • attackTime, decayTime, sustained and releaseTime refer to the classic Attack/Decay/Sustain/Release (ADSR) envelope that’s common in music synthesizers. Here it’s used to provide more fine-grained control over the haptic event.
  • audioVolume, audioBrightness, audioPan, and audioPitch can modify supporting audio.

Events can be specified directly in code as CHHapticEvent objects – or supplied as dictionaries, and the framework generates CHHapticEvent objects from there. In this case, missing values are filled in with defaults.
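As a sketch of the dictionary route, a pattern with a single transient event could be assembled like this (values are illustrative, and everything not specified falls back to its default):

```swift
import CoreHaptics

func makeTransientPattern() throws -> CHHapticPattern {
    // One optional event parameter, expressed in dictionary form.
    let intensity: [CHHapticPattern.Key: Any] = [
        .parameterID: CHHapticEvent.ParameterID.hapticIntensity,
        .parameterValue: 0.9
    ]

    // The event itself: type, start time, and its parameters.
    let event: [CHHapticPattern.Key: Any] = [
        .eventType: CHHapticEvent.EventType.hapticTransient,
        .time: 0.0,
        .eventParameters: [intensity]
    ]

    // A pattern is a list of such events; missing values are filled in with defaults.
    let patternDictionary: [CHHapticPattern.Key: Any] = [
        .pattern: [
            [CHHapticPattern.Key.event: event]
        ]
    ]
    return try CHHapticPattern(dictionary: patternDictionary)
}
```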

CHHapticEvent objects are fully Codable compliant, which means they can be transformed to and from JSON without much additional code. Apple has also defined a new file format to specify a pattern: AHAP, the “Apple Haptic and Audio Pattern,” which is a fancy name for a serialized JSON file of this data structure. As haptic patterns can become very complicated, a file format and the ability to play that file back come in very handy.
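Assuming an AHAP file ships in the app bundle (“Heartbeat.ahap” is a made-up name here), one convenient route is to hand the file URL straight to the engine:

```swift
import CoreHaptics

func playHeartbeat(on engine: CHHapticEngine) throws {
    // "Heartbeat.ahap" is a hypothetical AHAP file bundled with the app.
    guard let url = Bundle.main.url(forResource: "Heartbeat", withExtension: "ahap") else { return }

    try engine.start()
    // Parses the AHAP's JSON into a pattern and plays it on the Taptic Engine.
    try engine.playPattern(from: url)
}
```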

An example of an audio pattern and a corresponding haptic event pattern: a sharp transient event, followed by a continuous event with a soft transient.

What about Audio?

The relationship between Core Haptics and audio is a curious one. On one side, the Taptic Engine is separate from the audio subsystem: Core Haptics schedules its playback for a specified time or “as soon as possible,” but for mechanical and system reasons there are limits to what can be done, and it’s not as precise as system audio. On the other hand, audio plays a huge part in how haptics are perceived. In the real world, there is rarely haptics without audio, and even a tiny bit of sound, too subtle to be consciously heard, can make a huge difference. If you have an Apple Watch that you usually keep on mute, try setting it to almost silent instead, so you can barely hear it: you’ll notice the audio changes how the haptics feel. However, while the haptics are independent of Core Audio and you don’t get high-priority real-time precision to schedule them, the accompanying sound must fit the vibrations perfectly; being off by a few milliseconds is noticeable. Because of this, audio can be included in the haptics pattern, and the framework ensures that it’s perfectly in sync with the movement that the user feels.
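A sketch of how that looks in practice, assuming a short click sound (here a placeholder “tick.wav”) is bundled with the app: the audio file is registered with the engine and then referenced by an audio event that lives in the same pattern as the haptic event.

```swift
import CoreHaptics

func makeClickWithSound(engine: CHHapticEngine, audioURL: URL) throws -> CHHapticPattern {
    // Register the audio file (e.g. a bundled "tick.wav") with the engine;
    // the returned ID is what the audio event refers to.
    let resourceID = try engine.registerAudioResource(audioURL, options: [:])

    let haptic = CHHapticEvent(
        eventType: .hapticTransient,
        parameters: [CHHapticEventParameter(parameterID: .hapticIntensity, value: 1.0)],
        relativeTime: 0
    )

    // Because the audio event is scheduled inside the same pattern,
    // the framework keeps it in sync with the movement the user feels.
    let sound = CHHapticEvent(
        audioResourceID: resourceID,
        parameters: [CHHapticEventParameter(parameterID: .audioVolume, value: 0.4)],
        relativeTime: 0
    )

    return try CHHapticPattern(events: [haptic, sound], parameters: [])
}
```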

Even though haptics are not affected by the “Audio Session,” the set of rules for audio playback that the operating system grants to an app, the accompanying sound is. If the iPhone ringer switch is off, the accompanying sound, even a very subtle, barely audible click, is also off. iOS’s Audio Session is a topic of its own: in general, an Audio Session is provided to your app by the system, it can be taken away or interrupted, and the session category your app chooses determines whether your sound is heard. Here’s a good introduction to the topic.
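For illustration, this is roughly what choosing a session category looks like; as a rule of thumb, the ambient categories are silenced by the ringer switch, while .playback keeps playing (whether that’s appropriate depends on your app):

```swift
import AVFoundation

func configureAudioSession() {
    let session = AVAudioSession.sharedInstance()
    do {
        // .ambient respects the ringer switch and mixes with other audio;
        // .playback would keep the sound audible even in silent mode.
        try session.setCategory(.ambient, options: [.mixWithOthers])
        try session.setActive(true)
    } catch {
        print("Failed to configure audio session: \(error)")
    }
}
```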

Introducing Lofelt Composer

Now, with all this theory about continuous and transient base types, how does one get more complex patterns like the vibrations of a combustion engine, a heartbeat, or a drum kit? This is where Lofelt Composer comes in: you supply a waveform file, and it creates a matching AHAP, ready to play in your app. Lofelt Composer runs in the browser for pattern editing but comes with a companion iPhone app that lets you play your creation immediately. We’ll cover Composer in more depth in the second part of this article, but it makes sense to play around with it a bit to get accustomed to how it works. Composer makes heavy use of curves to modify the sharpness and intensity of the haptics; how this works under the hood is also covered in the second part.

Lofelt Composer for iOS.

How it all comes together in Code

Now that we’ve covered the basics and have a good source for AHAP files, it’s time to try it in code. Fortunately, Core Haptics code is very straightforward, and there are only a few steps necessary to get it going.

  1. First, you’ll need to check if Core Haptics is available. You can use CHHapticEngine.capabilitiesForHardware().supportsHaptics for that, and it makes sense to check support once early on and store the result in a Bool.
  2. Create a CHHapticEngine and retain it somewhere. Engines are not singletons, so you can create more than one, and, for example, attach one to each ViewController that needs it.
  3. If you need it for more than a few simple effects, create handlers for recovering from failures and stopping. Then, start() the engine.
  4. Create a CHHapticPatternPlayer or, if you need more control over playback (for example, when playing patterns loaded from a file), a CHHapticAdvancedPatternPlayer, using the haptic engine’s makePlayer(with:) or makeAdvancedPlayer(with:) method.
  5. Start playing the haptics by calling start(atTime:) on your player. A time of 0 (CHHapticTimeImmediate) starts playback immediately.

A few things to keep in mind: the engine needs to be retained to work, but the players are “fire and forget” and continue playing after release, so there’s usually no point in building complex structures to keep them around. Players are also cheap to make, so one approach would be to have one engine per ViewController, and multiple players if you have multiple patterns to mix.
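Putting the five steps together, a minimal sketch (with simplified error handling and an arbitrary one-click pattern) could look like this:

```swift
import CoreHaptics

final class HapticsController {
    // Step 1: check support once and keep the result around.
    private let supportsHaptics = CHHapticEngine.capabilitiesForHardware().supportsHaptics
    // Step 2: the engine must stay retained for as long as haptics are needed.
    private var engine: CHHapticEngine?

    func setUp() {
        guard supportsHaptics else { return }
        do {
            let engine = try CHHapticEngine()
            // Step 3: handlers for stopping and recovery, then start the engine.
            engine.stoppedHandler = { reason in print("Engine stopped: \(reason)") }
            engine.resetHandler = { [weak self] in try? self?.engine?.start() }
            try engine.start()
            self.engine = engine
        } catch {
            print("Haptics unavailable: \(error)")
        }
    }

    func playClick() {
        guard let engine = engine else { return }
        do {
            let click = CHHapticEvent(
                eventType: .hapticTransient,
                parameters: [CHHapticEventParameter(parameterID: .hapticIntensity, value: 1.0)],
                relativeTime: 0
            )
            let pattern = try CHHapticPattern(events: [click], parameters: [])
            // Steps 4 and 5: players are cheap and fire-and-forget.
            let player = try engine.makePlayer(with: pattern)
            try player.start(atTime: 0)
        } catch {
            print("Failed to play haptic: \(error)")
        }
    }
}
```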

In the next episode

We cover a workflow with Lofelt Composer and look under the hood of the AHAP file format. You’ll learn additional ways to modify haptic patterns with curves and parameters and to make them dynamic so they can adapt to your app’s needs at runtime. We’ll also share more of what we learned about Core Haptics along the way.

Gernot Poetsch

Gernot Poetsch started as a Mac developer and has been building iOS apps ever since the iPhone has existed. With his company nxtbgthng (nxtbgthng.com), he has created apps for various high-profile clients and is now launching Wallflower, an iPad app that turns your device into a smart home control panel.