The mouse pointer is central to direct-manipulation graphical user interfaces, and modern desktop operating systems provide accessibility features that enable pointer control via eye or head movement. Existing solutions, such as Eye Control on Windows and Head Pointer on macOS, rely on continuous video capture, which makes them sensitive to lighting conditions and user position and raises privacy concerns. In this work, we demonstrate a head-pointing approach that tracks head movements with the gyroscope and accelerometer built into commercial headphones. To improve pointing precision, we implement pointer snapping that leverages accessibility information exposed by application user interfaces. This approach is independent of camera placement and lighting conditions, offers privacy advantages, and requires no specialized hardware beyond commonly used headphones. Our demo highlights how reusing existing accessibility infrastructure can support more inclusive pointing interactions and contribute to more accessible interactive systems.
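The abstract does not detail the snapping algorithm, but the core idea of pointer snapping can be sketched as follows: query the accessibility API for the bounding boxes of interactive elements, then pull the pointer to the nearest element center within a fixed radius. The names `Target` and `snap_pointer`, the radius value, and the nearest-center heuristic are illustrative assumptions, not the paper's actual implementation.

```python
from dataclasses import dataclass
import math

@dataclass(frozen=True)
class Target:
    """Bounding box of an interactive element, as an accessibility
    API might report it (assumed structure, for illustration)."""
    x: float       # top-left corner, screen coordinates
    y: float
    width: float
    height: float

    @property
    def center(self) -> tuple[float, float]:
        return (self.x + self.width / 2, self.y + self.height / 2)

def snap_pointer(px: float, py: float, targets: list[Target],
                 radius: float = 40.0) -> tuple[float, float]:
    """Return the pointer position snapped to the nearest target center
    within `radius` pixels; fall back to the raw position otherwise."""
    best, best_dist = None, radius
    for t in targets:
        cx, cy = t.center
        d = math.hypot(px - cx, py - cy)
        if d <= best_dist:
            best, best_dist = (cx, cy), d
    return best if best is not None else (px, py)
```

For example, with a single button at `Target(100, 100, 80, 30)`, a pointer at `(145, 110)` snaps to the button center `(140.0, 115.0)`, while a pointer far outside the radius is left untouched. A real system would re-query targets as windows change and likely add hysteresis so the pointer does not oscillate between nearby targets.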