In this fork, I have added support for rendering video previews for your selected papers. This creates an engaging, dynamic way to showcase your research at a glance.
How Video Previews Work
When a visitor browses your selected publications, they’ll see an autoplay video preview that gives a quick visual overview of your paper. This is particularly effective for research that involves visual elements, user interfaces, or interactive systems.
Here’s how some of my selected papers appear with video previews:
Beyond the Phone: Exploring Phone-XR Integration through Multi-View Transitions for Real-World Applications
Fengyuan Zhu, Xun Qian, Daniel Kalmar, Mahdi Tayarani, Eric J. Gonzalez, Mar Gonzalez-Franco, David Kim, and Ruofei Du
In 2025 IEEE Conference on Virtual Reality and 3D User Interfaces (VR), Saint Malo, France, 2025
Despite the growing prevalence of Extended Reality (XR) headsets, their integration with mobile phones remains limited. Existing approaches primarily replicate the phone’s interface in XR or use the phone solely as a 6DOF controller. This paper introduces a novel framework for seamless transitions among mirrored, magnified, and augmented views that dynamically adapts the interface to the content and state of mobile applications. To achieve this, we establish a design space through literature reviews and expert workshops, outline user journeys with common real-world applications, and develop a prototype system that automatically analyzes UI layouts to provide enhanced controls and spatial augmentation. We validate our prototype system with a user study to assess its adaptability to a broad spectrum of applications at runtime, report its strengths and weaknesses, and suggest directions to advance future adaptation in Phone-XR integration.
BISHARE: Exploring Bidirectional Interactions Between Smartphones and Head-Mounted Augmented Reality
Fengyuan Zhu and Tovi Grossman
In Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems, Honolulu, HI, USA, 2020
In pursuit of a future where HMD devices can be used in tandem with smartphones and other smart devices, we present BISHARE, a design space of cross-device interactions between smartphones and ARHMDs. Our design space is unique in that it is bidirectional in nature, as it examines how both the HMD can be used to enhance smartphone tasks, and how the smartphone can be used to enhance HMD tasks. We then present an interactive prototype that enables cross-device interactions across the proposed design space. A 12-participant user study demonstrates the promise of the design space and provides insights, observations, and guidance for the future.
PhoneInVR: An Evaluation of Spatial Anchoring and Interaction Techniques for Smartphone Usage in Virtual Reality
Fengyuan Zhu, Mauricio Sousa, Ludwig Sidenmark, and Tovi Grossman
In Proceedings of the 2024 CHI Conference on Human Factors in Computing Systems, Honolulu, HI, USA, 2024
When users wear a virtual reality (VR) headset, they lose access to their smartphone and accompanying apps. Past work has proposed smartphones as enhanced VR controllers, but little work has explored using existing smartphone apps and performing traditional smartphone interactions while in VR. In this paper, we consider three potential spatial anchorings for rendering smartphones in VR: on top of a tracked physical smartphone which the user holds (Phone-locked), on top of the user’s empty hand, as if holding a virtual smartphone (Hand-locked), or in a static position in front of the user (World-locked). We conducted a comparative study of target acquisition, swiping, and scrolling tasks across these anchorings using direct Touch or above-the-surface Pinch. Our findings indicate that physically holding a smartphone with Touch improves accuracy and speed for all tasks, and Pinch performed better with virtual smartphones. These findings provide a valuable foundation to enable smartphones in VR.
PinchLens: Applying Spatial Magnification and Adaptive Control-Display Gain for Precise Selection in Virtual Reality
Fengyuan Zhu, Ludwig Sidenmark, Mauricio Sousa, and Tovi Grossman
In 2023 IEEE International Symposium on Mixed and Augmented Reality (ISMAR), Sydney, New South Wales, Australia, 2023
We present PinchLens, a new free-hand target selection technique for acquiring small and dense targets in Virtual Reality. Traditional pinch-based selection does not allow people to precisely manipulate small and dense objects due to tracking and perceptual inaccuracies. Our approach combines spatial magnification, an adaptive control-display gain, and visual feedback to improve selection accuracy. When a user starts the pinching selection process, a magnifying bubble expands the scale of nearby targets, an adaptive control-to-display ratio is applied to the user’s hand for precision, and a cursor is displayed at the estimated pinch point for enhanced visual feedback. We performed a user study to compare our technique to traditional pinch selection and several variations to isolate the impact of each of the technique’s features. The results showed that PinchLens significantly outperformed traditional pinch selection, reducing error rates from 18.9% to 1.9%. Furthermore, we found that magnification was the dominant feature in producing this improvement, while the adaptive control-display gain and visual pinch cursor were also helpful in several conditions.
How to Add Video Previews to Your Papers
Adding video previews to your publications is straightforward. In your BibTeX entries (typically in _bibliography/papers.bib), you need to add two key elements:
Mark the paper as selected by adding selected={true}
Add a video preview by including preview={YourPaperName.mp4}
Here’s an example of a BibTeX entry with video preview:
@INPROCEEDINGS{PaperKey,
  author={Author, A. and Researcher, B.},
  title={Your Amazing Research Paper},
  booktitle={2025 Conference Proceedings},
  year={2025},
  pages={1-10},
  abstract={Your paper abstract goes here...},
  selected={true},
  preview={YourPaperName.mp4},
  pdf={YourPaper.pdf},
  video={https://youtu.be/your-full-video}
}
Video Requirements and Tips
For optimal results with video previews:
Format: Create MP4 files for best compatibility
Duration: Keep videos short (5-15 seconds) to quickly convey the key visual aspects
Size: Optimize file size to ensure fast loading
Content: Focus on the most visually interesting aspects of your research
No Audio: The videos play muted, so visual content is key
Fallback Image: The system automatically uses a JPG with the same base name as the video as a fallback
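Before deploying, it can help to verify that each preview follows these tips. The sketch below is a small, hypothetical helper (the function name and the 5 MB size limit are my own illustrative choices, not part of the theme) that checks each MP4 for a matching JPG fallback and a reasonable file size:

```python
from pathlib import Path

# Illustrative size limit; tune it to your hosting constraints.
MAX_PREVIEW_BYTES = 5 * 1024 * 1024  # 5 MB

def check_previews(preview_dir):
    """Return warnings for preview files that miss the guidelines:
    every .mp4 should have a same-named .jpg fallback and stay small."""
    warnings = []
    for video in sorted(Path(preview_dir).glob("*.mp4")):
        if not video.with_suffix(".jpg").exists():
            warnings.append(f"{video.name}: missing JPG fallback")
        if video.stat().st_size > MAX_PREVIEW_BYTES:
            warnings.append(f"{video.name}: larger than {MAX_PREVIEW_BYTES} bytes")
    return warnings
```

Running `check_previews("assets/video/publication_preview")` before a deploy surfaces any preview that would silently render without a fallback image.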
File Organization
Place your video preview files in the /assets/video/publication_preview/ directory. The system will automatically look for them there.
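One way to produce a preview that fits this layout and the tips above is to trim and mute a longer video with ffmpeg. The sketch below only builds the ffmpeg command line and runs it when ffmpeg and the source file are actually present; the source filename, 10-second duration, and 640-pixel width are illustrative assumptions, not requirements of the theme:

```python
import shutil
import subprocess
from pathlib import Path

def build_preview_cmd(src, out, seconds=10, width=640):
    """Build an ffmpeg command that trims `src` to `seconds`, strips audio
    (previews play muted anyway), scales to `width`, and writes `out`."""
    return [
        "ffmpeg", "-y", "-i", str(src),
        "-t", str(seconds),           # keep only the first few seconds
        "-an",                        # drop the audio track
        "-vf", f"scale={width}:-2",   # scale width, keep height even
        "-movflags", "+faststart",    # allow playback before full download
        str(out),
    ]

if __name__ == "__main__":
    src = Path("full_talk.mp4")  # hypothetical source recording
    out = Path("assets/video/publication_preview/YourPaperName.mp4")
    if shutil.which("ffmpeg") and src.exists():
        out.parent.mkdir(parents=True, exist_ok=True)
        subprocess.run(build_preview_cmd(src, out), check=True)
```

Keeping the conversion in one place makes it easy to regenerate every preview with consistent settings when you add a new paper.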
Benefits of Video Previews
Adding video previews to your selected papers offers several advantages:
Increased Engagement: Visitors are more likely to notice and explore papers with dynamic content
Better Understanding: Complex concepts can be quickly conveyed visually
Professional Appearance: Creates a modern, interactive showcase of your research
Differentiation: Stands out from traditional static publication lists
By implementing video previews for your key publications, you can create a more engaging and informative academic website that effectively showcases your research contributions.