Controlling solar radiation through large-area glazing is essential for optimising energy consumption and reducing occupant glare and overheating. Automated shading in real-world buildings is actuated when measurements of outdoor conditions (e.g. solar tracking or irradiance) or indoor environments (e.g. lux levels) exceed pre-established set-point values. However, these traditional proxies are often insufficient to capture actual occupant visual and thermal preferences, resulting in user dissatisfaction with the automatic system and the indoor environmental conditions, which in turn leads to reduced productivity. Affective computing and the Internet of Things provide an unprecedented opportunity to include occupants directly in shading control loops, but the means of capturing transient occupant preferences have yet to be fully developed. This paper describes the underlying research and demonstration prototype of a novel occupant-centred system that captures occupant preferences in real time and controls the shading of glass facades, thereby ensuring visual comfort in an intelligent manner. The prototype consists of wearable sensors and a facial action unit detection system that together provide occupant-centred information for controlling shading and/or switchable glazing. The prototype is demonstrated in a living laboratory, and its performance is tested against conventional shading control systems and across several occupants.