This month's Graphics & Rendering category includes 4 granted patents, one each from Activision Blizzard, EA, Sony, and Tencent.
The patents cover methods for optimizing visual performance in games, including EA's distance-based particle coloring system for environmental effects and an automated approach to generating Level of Detail meshes using point clouds. Tencent's patent describes GPU-based parallel decoding to accelerate texture decompression, while Sony's technology dynamically morphs avatar meshes to visually represent player progression in multiplayer environments.
EA received 1 patent for a mesh particle rendering system that adjusts visual effects based on how far particles are from objects in the game world. The technology applies different colors to particles depending on distance thresholds, allowing smooth transitions as spatial relationships change during gameplay. This approach creates environmental effects that respond dynamically to moving game elements rather than relying on pre-rendered or static particle systems.
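The distance-threshold blending described above can be sketched as follows. This is a minimal illustration, not EA's claimed method; the function and parameter names (`near_threshold`, `far_threshold`, the two endpoint colors) are assumptions for the example.

```python
# Illustrative sketch of distance-based particle coloring: a particle's
# color is fixed inside/outside two distance thresholds and blended
# smoothly between them, so colors shift gradually as objects move.

def lerp(a, b, t):
    """Linearly interpolate between two RGB tuples by factor t in [0, 1]."""
    return tuple(a[i] + (b[i] - a[i]) * t for i in range(3))

def particle_color(distance, near_threshold, far_threshold,
                   near_color, far_color):
    """Pick a particle color from its distance to a game object."""
    if distance <= near_threshold:
        return near_color          # fully "near" color
    if distance >= far_threshold:
        return far_color           # fully "far" color
    # Smooth transition in the band between the two thresholds.
    t = (distance - near_threshold) / (far_threshold - near_threshold)
    return lerp(near_color, far_color, t)
```

Because the blend factor is recomputed from the current distance each frame, the effect responds continuously as spatial relationships change, rather than snapping between two static looks.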
Tencent received 1 patent covering GPU-accelerated texture decompression for faster image loading in games. The system shifts the decompression workload from the CPU to the graphics card by dividing compressed textures into blocks and processing them in parallel across independent shader workgroups. This parallel approach speeds up the decompression process while freeing up CPU resources for other game operations.
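The block-parallel idea can be sketched in miniature. In the patented approach the work runs in independent GPU shader workgroups; in this sketch a thread pool stands in for the workgroups, and a toy run-length coding stands in for the actual texture codec. Both substitutions are assumptions for illustration only.

```python
# Illustrative sketch of block-parallel decompression: the compressed
# texture is split into independently decodable blocks, each decoded
# without waiting on any other (mirroring independent workgroups).
from concurrent.futures import ThreadPoolExecutor

def rle_decode_block(block):
    """Decode one block of (count, value) run-length pairs."""
    out = []
    for count, value in block:
        out.extend([value] * count)
    return out

def decompress_texture(blocks, workers=4):
    """Decode all blocks in parallel, then reassemble in block order."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        decoded = list(pool.map(rle_decode_block, blocks))
    return [texel for block in decoded for texel in block]
```

The key property is that no block depends on another's output, which is what lets the real system fan the work out across shader workgroups while the CPU does other things.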
Activision Blizzard received 1 patent for automated generation of Level of Detail meshes, which are simplified versions of 3D models used to maintain performance when objects appear at varying distances. The system positions virtual imaging probes in spherical or hemispherical patterns around game environments and uses GPU processing to determine optimal probe placement, reducing the manual work typically required from artists. The approach generates meshes with fewer triangles compared to traditional grid-based methods.
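The spherical probe layout can be sketched as follows. The patent's GPU-driven placement heuristic is not public in this summary, so the golden-angle (Fibonacci-sphere) spacing below is a standard even-distribution stand-in, labeled as an assumption, not the patented method.

```python
# Illustrative sketch: spread n virtual imaging probes evenly over a
# sphere (or upper hemisphere) around a scene, centered on the origin.
import math

def sphere_probes(n, radius=1.0, hemisphere=False):
    """Return n (x, y, z) probe positions on a sphere of the given radius."""
    probes = []
    golden = math.pi * (3.0 - math.sqrt(5.0))  # golden angle in radians
    for i in range(n):
        # y sweeps from +1 down to -1 (or down to 0 for a hemisphere).
        lo = 0.0 if hemisphere else -1.0
        y = 1.0 - (i / max(n - 1, 1)) * (1.0 - lo)
        r = math.sqrt(max(0.0, 1.0 - y * y))   # ring radius at this height
        theta = golden * i                      # spiral around the axis
        probes.append((radius * r * math.cos(theta),
                       radius * y,
                       radius * r * math.sin(theta)))
    return probes
```

Each probe position would then be used as a virtual camera from which the environment is imaged, with GPU processing scoring candidate placements before the simplified mesh is built.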
Sony received 1 patent that automatically changes the physical appearance of player avatars based on their in-game actions over time. The system monitors how frequently players perform certain activities and progressively deforms the character mesh to reflect those behaviors, making stat changes and playstyle choices visible to other players in multiplayer settings. The deformations can be applied or reversed depending on action frequency within defined time periods, and the system uses weighting coefficients tied to specific avatar attributes.
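The frequency-weighted morphing can be sketched as below. The action names, the threshold, and the linear scaling are illustrative assumptions; the source only states that weighting coefficients are tied to avatar attributes and that deformations can be applied or reversed based on action frequency within a time window.

```python
# Illustrative sketch: map per-action counts within a time window to
# per-attribute morph weights in [0, 1]. Frequent actions push their
# attribute toward full deformation; when the behavior stops, counts
# fall and the weight relaxes back, reversing the deformation.

def morph_weights(action_counts, period_threshold, coefficients):
    """Compute a morph weight for each tracked action.

    action_counts: times each action occurred in the current window
    period_threshold: count at which an action reaches full effect
    coefficients: per-action weighting tied to an avatar attribute
    """
    weights = {}
    for action, count in action_counts.items():
        coeff = coefficients.get(action, 0.0)
        level = min(count / period_threshold, 1.0)  # clamp at full effect
        weights[action] = coeff * level
    return weights
```

Recomputing the weights each window is what makes the deformation reversible: an avatar shaped by frequent sprinting gradually returns toward the base mesh once the player stops sprinting.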