The mirror is built around an old iPhone SE running a custom SwiftUI app. At certain times of day, the display changes: in the morning it shows my sleep score; before I head out the door for work it switches to my commute traffic; when I work from home it shows my next meeting; and when I come home from work it shows the time and weather.
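A minimal sketch of that time-based switching, assuming a pure function picks a "scene" from the current hour and a SwiftUI view re-evaluates it on a timer. The names here (`MirrorScene`, `scene(forHour:)`, the exact hour ranges) are my own illustration, not the actual app's code.

```swift
import SwiftUI

// The screens the mirror can show (hypothetical names).
enum MirrorScene {
    case sleepScore      // morning
    case commuteTraffic  // before leaving for work
    case nextMeeting     // while working from home
    case timeAndWeather  // evening / default
}

// Pure function: easy to test, independent of any UI.
// The hour boundaries are illustrative assumptions.
func scene(forHour hour: Int, workingFromHome: Bool) -> MirrorScene {
    switch hour {
    case 6..<8:  return .sleepScore
    case 8..<17: return workingFromHome ? .nextMeeting : .commuteTraffic
    default:     return .timeAndWeather
    }
}

struct MirrorView: View {
    @State private var now = Date()
    // Re-check the clock once a minute so the scene can change.
    private let tick = Timer.publish(every: 60, on: .main, in: .common).autoconnect()

    var body: some View {
        Group {
            switch scene(forHour: Calendar.current.component(.hour, from: now),
                         workingFromHome: true) {
            case .sleepScore:     Text("Sleep score")
            case .commuteTraffic: Text("Commute traffic")
            case .nextMeeting:    Text("Next meeting")
            case .timeAndWeather: Text("Time & weather")
            }
        }
        .onReceive(tick) { now = $0 }
    }
}
```

Keeping the hour-to-scene logic in a plain function, outside the view, means the switching behaviour can be unit-tested without rendering anything.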
The two-way mirror then sits on top of the phone. Since the space behind the mirror is completely dark, the only light that escapes through it comes from the phone's display pixels. And that's how it works! Pretty simple, actually.
I used Augmented Reality to prototype the form faster than building physical models, for instance at times when I couldn't be in the workshop. I modelled my designs in Fusion 360 and textured them in Substance Painter. Finally, I exported them as .USDZ files, which I could share with people to visualise in their own spaces, at life size. AR mode only works on iPhones and iPads.
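For anyone curious how a shared .usdz gets viewed in AR: on iOS, one common route is AR Quick Look via `QLPreviewController`. This is a hedged sketch of that approach, not the workflow from the post; the file name `mirror.usdz` is an assumption.

```swift
import UIKit
import QuickLook

// Presents a .usdz file in AR Quick Look, which lets the viewer place
// the model in their own space at life size.
final class USDZPreviewer: NSObject, QLPreviewControllerDataSource {
    let fileURL: URL  // e.g. a local copy of "mirror.usdz" (hypothetical name)

    init(fileURL: URL) { self.fileURL = fileURL }

    func present(from viewController: UIViewController) {
        let preview = QLPreviewController()
        preview.dataSource = self
        viewController.present(preview, animated: true)
    }

    // MARK: QLPreviewControllerDataSource
    func numberOfPreviewItems(in controller: QLPreviewController) -> Int { 1 }

    func previewController(_ controller: QLPreviewController,
                           previewItemAt index: Int) -> QLPreviewItem {
        // NSURL conforms to QLPreviewItem, so a file URL can be returned directly.
        fileURL as QLPreviewItem
    }
}
```

Because Quick Look handles the AR session itself, no ARKit code is needed just to let someone walk around the model, which is part of why .USDZ is such a convenient sharing format.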