<?xml version="1.0" encoding="utf-8" standalone="yes" ?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom">
  <channel>
    <title>Human-Computer Interaction | Rongtao Zhang</title>
    <link>https://isanshi.github.io/tag/human-computer-interaction/</link>
      <atom:link href="https://isanshi.github.io/tag/human-computer-interaction/index.xml" rel="self" type="application/rss+xml" />
    <description>Human-Computer Interaction</description>
    <generator>Wowchemy (https://wowchemy.com)</generator><language>en-us</language><lastBuildDate>Sun, 16 Feb 2025 00:00:00 +0000</lastBuildDate>
    <image>
      <url>https://isanshi.github.io/media/icon_hu0abe02584a566683a89c96345314bcf2_17128_512x512_fill_lanczos_center_3.png</url>
      <title>Human-Computer Interaction</title>
      <link>https://isanshi.github.io/tag/human-computer-interaction/</link>
    </image>
    
    <item>
      <title>VR Surfing Hand-Tracking Control</title>
      <link>https://isanshi.github.io/project/vrsurfing/</link>
      <pubDate>Sun, 16 Feb 2025 00:00:00 +0000</pubDate>
      <guid>https://isanshi.github.io/project/vrsurfing/</guid>
      <description>&lt;h2 id=&#34;overview&#34;&gt;&lt;strong&gt;Overview&lt;/strong&gt;&lt;/h2&gt;
&lt;p&gt;This project explores &lt;strong&gt;VR surfing interaction&lt;/strong&gt; with a &lt;strong&gt;VR headset&lt;/strong&gt; and a &lt;strong&gt;motorized mechanical surfboard platform&lt;/strong&gt; designed to simulate the body motion of real surfing. The main goal was to create a more immersive and physically engaging surfing experience by combining virtual-environment feedback with real-time body tracking and motion control.&lt;/p&gt;
&lt;p&gt;The project was completed in collaboration with &lt;strong&gt;Premankur Banerjee&lt;/strong&gt;, and was advised by &lt;a href=&#34;https://viterbi.usc.edu/directory/faculty/Culbertson/Heather&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;Heather Culbertson&lt;/a&gt; and &lt;a href=&#34;https://ampl.usc.edu/&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;Jason Kutch&lt;/a&gt; at &lt;strong&gt;USC&lt;/strong&gt;.&lt;/p&gt;
&lt;video controls preload=&#34;metadata&#34; style=&#34;width:100%;border-radius:16px;margin-top:1rem;&#34;&gt;
  &lt;source src=&#34;env.MOV&#34; type=&#34;video/quicktime&#34;&gt;
&lt;/video&gt;
&lt;p&gt;&lt;em&gt;Experimental environment for the VR surfing setup.&lt;/em&gt;&lt;/p&gt;
&lt;h2 id=&#34;system-design-and-results&#34;&gt;&lt;strong&gt;System Design and Results&lt;/strong&gt;&lt;/h2&gt;
&lt;p&gt;The overall system integrated &lt;strong&gt;visual perception&lt;/strong&gt;, &lt;strong&gt;motion interpretation&lt;/strong&gt;, and &lt;strong&gt;physical actuation&lt;/strong&gt; into one interactive loop. The &lt;strong&gt;VR headset&lt;/strong&gt; provided the immersive surfing scene and real-time environmental feedback, while the &lt;strong&gt;hand-tracking module&lt;/strong&gt; extracted user motion from joint landmarks. The control logic mapped hand movement and virtual water-level variation to board commands, and a &lt;strong&gt;controllable mechanical surfboard&lt;/strong&gt; executed the commanded movement to reproduce wave-like motion.&lt;/p&gt;
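&lt;p&gt;As a rough illustration, the mapping from hand motion and virtual water level to a board command can be sketched as below. The function name, gain, and command range are assumptions made for the sketch, not the project’s actual controller.&lt;/p&gt;

```python
# Illustrative sketch of the control mapping described above: a normalized
# hand height and the virtual water level are combined into a single pitch
# command for the motorized surfboard. All names, gains, and ranges here
# are assumptions, not the project's real implementation.

def board_pitch(hand_height, water_level, gain=1.0):
    """Map hand height and water level (both in [0, 1]) to a pitch
    command clamped to [-1, 1]."""
    # Lean the board toward the raised hand, while the water level shifts
    # the neutral point so a rising virtual wave tilts the platform up.
    command = gain * (hand_height - 0.5) + (water_level - 0.5)
    return max(-1.0, min(1.0, command))
```

&lt;p&gt;Clamping the command keeps the physical platform inside a safe actuation range even when tracking briefly produces outlier values.&lt;/p&gt;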
&lt;p&gt;To enable natural interaction, we built a &lt;strong&gt;MediaPipe-based full-body joint tracking system&lt;/strong&gt; with a particular focus on &lt;strong&gt;hand landmark detection&lt;/strong&gt;. The user’s hand motion was interpreted together with changes in the virtual environment’s &lt;strong&gt;water level&lt;/strong&gt;, and this relationship was used to control the motion of the surfboard. By coupling perception and actuation in real time, the project created a closed-loop VR control system that translated body motion into physical surfing feedback.&lt;/p&gt;
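&lt;p&gt;A minimal sketch of this perception step, assuming MediaPipe-style normalized image coordinates (x and y in [0, 1], with y growing downward) and a simple one-pole filter to suppress landmark jitter before the signal reaches the board controller; the landmark index and filter constant are assumptions for the sketch.&lt;/p&gt;

```python
# Hypothetical illustration of reading a wrist landmark in MediaPipe-style
# normalized coordinates and smoothing it for control. The landmark layout
# and the filter constant are assumptions, not the project's actual code.

WRIST = 0  # wrist is landmark 0 in MediaPipe's hand-landmark list

def wrist_height(landmarks):
    """Convert the wrist's normalized y coordinate to a height in [0, 1]."""
    x, y = landmarks[WRIST]
    return 1.0 - y  # flip so larger values mean a raised hand

class LowPassFilter:
    """One-pole low-pass filter to damp frame-to-frame tracking jitter."""
    def __init__(self, alpha=0.3):
        self.alpha = alpha  # higher alpha = faster response, more jitter
        self.state = None
    def update(self, value):
        if self.state is None:
            self.state = value  # initialize on the first sample
        else:
            self.state = self.alpha * value + (1 - self.alpha) * self.state
        return self.state
```

&lt;p&gt;Filtering the tracked signal is a common design choice in this kind of loop: the motorized platform should follow the rider’s intent, not per-frame detection noise.&lt;/p&gt;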
&lt;p&gt;The project produced a working prototype for &lt;strong&gt;hand-tracking-based VR surfboard control&lt;/strong&gt;, demonstrating that body-joint tracking can drive a motorized platform in coordination with a virtual environment and reproduce part of the dynamic feel of real surfing.&lt;/p&gt;
&lt;p&gt;The main outcomes included:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;a &lt;strong&gt;MediaPipe-based full-body and hand-joint tracking system&lt;/strong&gt; for VR interaction,&lt;/li&gt;
&lt;li&gt;a control method that combines &lt;strong&gt;hand motion&lt;/strong&gt; and &lt;strong&gt;virtual water-level changes&lt;/strong&gt;,&lt;/li&gt;
&lt;li&gt;a real-time interface between the virtual surfing environment and a &lt;strong&gt;motorized mechanical surfboard&lt;/strong&gt;,&lt;/li&gt;
&lt;li&gt;and a prototype immersive experience designed to simulate the kinesthetic feel of surfing.&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;More broadly, the project showed how &lt;strong&gt;computer vision&lt;/strong&gt;, &lt;strong&gt;VR interaction&lt;/strong&gt;, and &lt;strong&gt;human-centered control design&lt;/strong&gt; can be combined to create embodied motion experiences beyond standard controller-based interfaces.&lt;/p&gt;
&lt;video controls preload=&#34;metadata&#34; style=&#34;width:100%;border-radius:16px;margin-top:1rem;&#34;&gt;
  &lt;source src=&#34;surfing.MOV&#34; type=&#34;video/quicktime&#34;&gt;
&lt;/video&gt;
&lt;p&gt;&lt;em&gt;VR surfboard system in operation.&lt;/em&gt;&lt;/p&gt;
</description>
    </item>
    
  </channel>
</rss>
