Beijing Zitiao Network Technology filed 2 patent applications this quarter across 2 categories: VR & AR (1) and Cloud Gaming (1).
The VR & AR application covers an eye-tracking system that dynamically adjusts depth sensor camera parameters based on user gaze direction to optimize performance and visual quality in XR devices. In Cloud Gaming, the filing describes a platform that analyzes professional player videos to automatically suggest optimal move sequences, allowing less experienced players to execute expert strategies with a single click.
The single VR & AR application addresses resource allocation in extended reality headsets by linking eye-tracking data directly to depth sensor configuration. Rather than operating depth cameras at fixed settings across the entire field of view, the system detects where a user's gaze falls and adjusts sensor parameters in that region accordingly. This gaze-driven approach concentrates computational power and sensor fidelity on areas receiving active attention while dialing back resource intensity in peripheral zones, potentially lowering both power draw and processing load.
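The gaze-driven allocation described above can be illustrated with a minimal sketch. This is not the patented implementation; the grid layout, parameter names (`resolution_scale`, `frame_rate_hz`), and the two-tier foveal/peripheral split are all assumptions chosen to show the general idea of concentrating sensor fidelity where the gaze lands:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class RegionConfig:
    resolution_scale: float  # fraction of full depth-sensor resolution
    frame_rate_hz: int       # depth capture rate for this region

# Hypothetical settings: full fidelity at the gaze point, reduced elsewhere
FOVEAL = RegionConfig(resolution_scale=1.0, frame_rate_hz=90)
PERIPHERAL = RegionConfig(resolution_scale=0.25, frame_rate_hz=30)

def configure_depth_regions(gaze_x: float, gaze_y: float, grid=(3, 3)):
    """Map a normalized gaze point (0..1 in each axis) onto a grid of
    sensor regions, assigning high-fidelity settings only to the region
    the user is looking at."""
    cols, rows = grid
    gaze_col = min(int(gaze_x * cols), cols - 1)
    gaze_row = min(int(gaze_y * rows), rows - 1)
    return {
        (r, c): FOVEAL if (r, c) == (gaze_row, gaze_col) else PERIPHERAL
        for r in range(rows)
        for c in range(cols)
    }

# Example: gaze near the upper-left corner of the field of view
cfg = configure_depth_regions(0.1, 0.1)
```

Here only one of nine regions runs at full resolution and 90 Hz, which is the mechanism the filing credits with reducing power draw and processing load.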
The Cloud Gaming application transforms passive video tutorials into executable in-game actions by extracting strategic sequences from professional player footage. The system performs semantic analysis on expert gameplay videos, matches those patterns against the current state of a player's live session, and surfaces relevant move sequences as one-click options within the game client. Instead of watching a guide and manually replicating complex inputs through trial and error, novice players can preview suggested operation chains and trigger them directly, bridging the gap between observing high-level play and performing it.
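The matching step, where extracted sequences are filtered against the player's live session, can be sketched as follows. The data model is an assumption: the filing does not specify how game state or move sequences are represented, so the tag-based `preconditions` check and the specificity ranking here are illustrative only:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class MoveSequence:
    name: str                  # label shown to the player
    inputs: tuple              # ordered input chain extracted from expert footage
    preconditions: frozenset   # game-state tags required before it applies

def suggest_sequences(current_state_tags: frozenset, library: list):
    """Return sequences whose preconditions are all satisfied by the
    current session state, most specific matches first."""
    applicable = [seq for seq in library
                  if seq.preconditions <= current_state_tags]
    return sorted(applicable, key=lambda s: len(s.preconditions), reverse=True)

# Hypothetical library distilled from professional gameplay videos
library = [
    MoveSequence("rush_push", ("move_A", "attack", "retreat"),
                 frozenset({"early_game", "lane_open"})),
    MoveSequence("safe_farm", ("move_B", "farm"),
                 frozenset({"early_game"})),
]

suggestions = suggest_sequences(frozenset({"early_game", "lane_open"}), library)
```

In a live client, the returned sequences would be rendered as one-click options; triggering one replays its `inputs` chain, which is the "watch becomes execute" bridge the filing describes.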