Robotics @ Penn GRASP
Deep Learning / Computer Vision Intern @
Kodiak Robotics
Email:
jchunx@seas.upenn.edu
LinkedIn:
Jason Xie
GitHub:
JChunX
Resume:
PDF
Examples:
Q: Is Jason a good fit for role X?
Q: Show me a photo of Jason's Stable Diffusion Project.
Q: Tell me about a project where Jason used PyTorch.
Note: LLMs like JasonGPT may produce inaccurate responses.
Hi, I'm Jason
I am a Robotics grad student at the University of Pennsylvania.
I am currently...
In Summer 2023, I interned at Kodiak Robotics and worked on deep learning & vision models for autonomous trucking.
Seeking full-time software engineering roles starting Summer 2024.
Technologies: WebGPU, JavaScript, tinygrad
The first implementation of ControlNet diffusion running locally in the browser. Rewrote ControlNet in tinygrad and compiled it to WebGPU, then optimized performance with fp16 inference and reduced compute-pipeline overhead.
Technologies: C++, CUDA, OpenGL
A path tracer written in CUDA.
In this project, I take advantage of the massively parallel processing power of the GPU to render photorealistic images. The end result is an interactive path tracer featuring specular and diffuse shading, along with several optimizations to improve performance.
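Diffuse shading in a path tracer is usually importance-sampled with a cosine-weighted hemisphere distribution, so the cosine term in the rendering equation cancels against the pdf. A minimal Python sketch of that sampling step (illustrative only — the project itself is written in CUDA):

```python
import math, random

def cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def normalize(v):
    length = math.sqrt(sum(c * c for c in v))
    return tuple(c / length for c in v)

def cosine_sample_hemisphere(n):
    """Sample a direction about unit normal n with pdf = cos(theta) / pi,
    the standard importance-sampling choice for a diffuse BRDF."""
    r1, r2 = random.random(), random.random()
    r, phi = math.sqrt(r1), 2.0 * math.pi * r2
    # point on the unit disk, lifted to the hemisphere (local z = normal)
    x, y, z = r * math.cos(phi), r * math.sin(phi), math.sqrt(1.0 - r1)
    # build a tangent frame around n
    up = (1.0, 0.0, 0.0) if abs(n[0]) < 0.9 else (0.0, 1.0, 0.0)
    t = normalize(cross(up, n))
    b = cross(n, t)
    return tuple(x * t[i] + y * b[i] + z * n[i] for i in range(3))
```

Sampled directions are unit length and always lie on the normal's side of the surface, which is what lets the diffuse bounce recurse without re-weighting by the cosine.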
IsaacSim Gym environments for embodied agents.
Technologies: PyTorch, Multi-Modal LLMs, Mujoco
We are interested in applying multi-modal foundation models towards the goal of general embodied autonomy. Rather than evaluating benchmarks in sim, we seek to validate our models in the real world using a mobile manipulator.
Progress:
Motion planning, control, data logging, and visualization. Implemented communication protocols between the robot and the inference server over WebSockets.
Verify safety limits in sim (Mujoco), integrate components on real robot.
Implement baseline techniques and evaluate tasks on new methods.
Technologies: C++, Vulkan, GLSL
A grass simulation in Vulkan.
This project implements the paper Responsive Real-Time Grass Rendering for General 3D Scenes, including Bezier curve grass blades, physical models for grass motion, and culling.
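Each blade in that paper is modeled as a quadratic Bezier curve (root, control point, tip), with the physics acting on the control point. Evaluating a point along a blade via de Casteljau's algorithm looks like this (a Python sketch for illustration — the project's version lives in GLSL shaders):

```python
def bezier_quad(p0, p1, p2, t):
    """Point at parameter t on a quadratic Bezier blade:
    p0 = root on the ground, p1 = physics-driven control point, p2 = tip."""
    # de Casteljau: repeated linear interpolation
    a = tuple((1 - t) * u + t * v for u, v in zip(p0, p1))
    b = tuple((1 - t) * u + t * v for u, v in zip(p1, p2))
    return tuple((1 - t) * u + t * v for u, v in zip(a, b))
```

t = 0 returns the root and t = 1 the tip, so the tessellation shader can walk t along the blade to emit vertices.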
Technologies: Python, C++, ROS2, PyTorch, TensorRT, Drake
Goal: Don't crash, and minimize lap time.
How: Implement a robust, safe, and fast autonomy stack on the NVIDIA Jetson edge computer.
What we did:
What we achieved:
Technologies: Python, PyTorch, Ai2Thor, OpenAI CLIP
Leveraging the power of CLIP (Contrastive Language-Image Pre-training), we adapt CLIP embeddings to learn a dense reward function for embodied agents. Using a transformer-based architecture, the reward model learns to pair video frames with their corresponding language instructions by minimizing the InfoNCE loss. We showcase the model's potential for fine-tuning agents on sparse-reward tasks by demonstrating that it returns appropriate rewards across a variety of scenarios.
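The symmetric InfoNCE objective treats each frame embedding and its own instruction embedding as the positive pair, with the rest of the batch as negatives. A minimal NumPy sketch of the loss (the actual model is PyTorch; function and argument names here are illustrative):

```python
import numpy as np

def info_nce(frame_emb, text_emb, tau=0.07):
    """Symmetric InfoNCE over a batch: row i of frame_emb is the positive
    match for row i of text_emb; tau is the softmax temperature."""
    v = frame_emb / np.linalg.norm(frame_emb, axis=1, keepdims=True)
    t = text_emb / np.linalg.norm(text_emb, axis=1, keepdims=True)
    logits = v @ t.T / tau  # (N, N) cosine similarities

    def xent_diag(lg):
        # cross-entropy with the diagonal as the target class per row
        lg = lg - lg.max(axis=1, keepdims=True)
        logp = lg - np.log(np.exp(lg).sum(axis=1, keepdims=True))
        return -np.mean(np.diag(logp))

    # average the frame->text and text->frame directions
    return 0.5 * (xent_diag(logits) + xent_diag(logits.T))
```

Correctly paired embeddings drive the loss toward zero, while mismatched pairs are penalized — which is what lets the trained similarity score double as a dense reward.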
Technologies: PyTorch, TensorRT, Python
Improved the robustness of a normalizing-flow based lidar localization system (Local_INN) by fusing IMU accelerometer data with neural network odometry estimates using an Unscented Kalman Filter. Created a ROS2 package for the system as a drop-in replacement for the particle filter-based localization system used by the F1Tenth platform.
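The fusion step can be sketched as one UKF predict/update cycle on a toy 1-D constant-acceleration state: the IMU acceleration drives the prediction, and the network's odometry estimate serves as the position measurement. This is a simplified illustration, not the actual Local_INN state vector or noise model:

```python
import numpy as np

def ukf_step(mu, P, accel, z_odom, dt, Q, R, alpha=1e-3, beta=2.0, kappa=0.0):
    """One UKF predict/update on state [position, velocity].
    accel: IMU acceleration used as the control input.
    z_odom: position from the neural-odometry network (the measurement)."""
    n = len(mu)
    lam = alpha ** 2 * (n + kappa) - n
    wm = np.full(2 * n + 1, 0.5 / (n + lam))
    wc = wm.copy()
    wm[0] = lam / (n + lam)
    wc[0] = wm[0] + 1.0 - alpha ** 2 + beta

    def sigmas(m, C):
        S = np.linalg.cholesky((n + lam) * C)
        return np.vstack([m, m + S.T, m - S.T])

    # predict: push sigma points through the constant-acceleration model
    X = sigmas(mu, P)
    Xp = np.column_stack([
        X[:, 0] + X[:, 1] * dt + 0.5 * accel * dt ** 2,  # position
        X[:, 1] + accel * dt,                            # velocity
    ])
    mu_p = wm @ Xp
    P_p = Q + sum(wc[i] * np.outer(Xp[i] - mu_p, Xp[i] - mu_p)
                  for i in range(2 * n + 1))

    # update: odometry measures position only
    Z = Xp[:, :1]
    z_hat = wm @ Z
    Pzz = R + sum(wc[i] * np.outer(Z[i] - z_hat, Z[i] - z_hat)
                  for i in range(2 * n + 1))
    Pxz = sum(wc[i] * np.outer(Xp[i] - mu_p, Z[i] - z_hat)
              for i in range(2 * n + 1))
    K = Pxz @ np.linalg.inv(Pzz)
    mu_new = mu_p + K @ (np.atleast_1d(z_odom) - z_hat)
    P_new = P_p - K @ Pzz @ K.T
    return mu_new, P_new
```

Because the sigma points capture the nonlinearity without linearizing, the same structure extends to the full pose state, and tuning Q vs. R trades trust between the IMU prediction and the network's odometry.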
Technologies: Drake, Mujoco, Python, C++
We are interested in the problem of transitioning between two different gaits for a quadruped robot. Building upon an existing convex MPC method for gait planning, we extend it to handle gait transitions by planning over multiple potential next gait states.
Technologies: C++, OpenGL
Adventures in 3D computer graphics and multi-threaded C++ programming. A simple Minecraft clone implementing custom shaders, voxel terrain generation, chunk management, and rendering.
Technologies: C++, Python, TensorFlow, ROS2, OpenCV, PyBullet
I led software development at NCSU Pack Bionics, a student organization that designs and builds prosthetic legs for the Cybathlon competition. Starting from scratch, I helped create the software stack for our prosthetic leg, including a PyBullet-ROS2 bridge for simulation, a vision-based gait planner, and an imitation-learning-based gait controller.
MSE, Robotics
BS, Computer Science & Biomedical Engineering