Ice Ice Fishing Devblog
Welcome to my first blog post about my journey as a game programmer. In this article, I will write about what I learned while working on my latest XR project.
When thinking about what project I should work on to improve my portfolio, I decided to work with an old friend: the Ultraleap hand-tracking device. From previous experience, I knew that directly grabbing and interacting with objects would not be suited to a game that was not in VR. New users always have trouble with depth perception when interacting with a game displayed on a flat screen; it is easier for them to point at the screen to control the game. With this control scheme in mind, I quickly identified a game I could recreate: the fishing mini-game from Club Penguin had the perfect control scheme.
Design challenges
I started my work by writing documentation for the project: I drafted how the main system and each sub-system would work. My first challenge was designing the system that handles fish creation. I needed a way to track four unique special fish animations, two per side, and to identify which ones were already in use. My solution was to use a 2D array indexed by the side a fish is created on and its animation number. The correct element of the array is updated by reading the information stored in the fish. When a fish is created, the system selects a side and assigns a random animation value between 0 and 7. If the value falls in the special range and the element at the index formed by the two values is not yet set, the method instantiates the corresponding special fish; otherwise, it creates a normal fish.
```csharp
// Tracks which special animations are in use, indexed as [side][animation].
// The inner lists are filled in elsewhere during setup.
private List<List<bool>> fishTracking = new List<List<bool>>();

private void fishInitialisation()
{
    int side = selectSide();           // Select 0 or 1.
    int animation = selectAnimation(); // Select between 0 and 7.

    if (animation <= 1 && !fishTracking[side][animation]) // Free special animation.
    {
        fishTracking[side][animation] = true;
        this.GetComponent<createFish>().createGreyFish(side, animation);
    }
    else
    {
        // We create a normal fish on the designated side.
        this.GetComponent<createFish>().createYelloFish(side);
    }
}
```
A second challenge was the level system and deciding which obstacles were allowed to be created at each level. I used two arrays: one that holds the maximum number of each obstacle type that can exist, and another that tracks the current number of active obstacles of each type. If the system tries to create an obstacle type that is already at capacity, it instantiates the default obstacle instead. The first array is updated at each new level.
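As a minimal sketch of that capacity check (the `spawnObstacle` and `spawnDefaultObstacle` helpers are hypothetical placeholders, not the project's actual methods):

```csharp
// Hypothetical sketch: per-type obstacle caps for the current level.
private int[] maxObstacles = { 3, 2, 1 };    // Rewritten at each new level.
private int[] activeObstacles = { 0, 0, 0 }; // Incremented on spawn, decremented on despawn.

private void trySpawnObstacle(int type)
{
    if (activeObstacles[type] < maxObstacles[type])
    {
        activeObstacles[type]++;
        spawnObstacle(type); // Placeholder for the real spawn call.
    }
    else
    {
        // This type is at capacity: fall back to the default obstacle.
        spawnDefaultObstacle();
    }
}
```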
New things I learned
While working on the project, I discovered how to record specific hand poses that can be used as event triggers. In the Ultraleap demo files, there is a scene where you can record specific hand poses. Those can then be used with the [system name here] to trigger an event similar to a button press. I used this feature to detect when the user's right hand closes, which drives catching a fish and placing a new worm on the hook.
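The wiring looks roughly like subscribing to a button. Here is a hedged sketch: `PoseDetectorStub` is a stand-in I invented for the Ultraleap detector component, and its `OnPoseDetected` event name is an assumption, not the plugin's real API.

```csharp
using UnityEngine;
using UnityEngine.Events;

// Stand-in for the Ultraleap pose detector (hypothetical name; the real
// component exposes similar UnityEvents you can subscribe to).
public class PoseDetectorStub : MonoBehaviour
{
    public UnityEvent OnPoseDetected;
}

public class CatchFishOnPose : MonoBehaviour
{
    [SerializeField] private PoseDetectorStub rightHandClosedDetector;

    private void OnEnable()
    {
        // Subscribe exactly as you would to a button press.
        rightHandClosedDetector.OnPoseDetected.AddListener(TryCatchFish);
    }

    private void OnDisable()
    {
        rightHandClosedDetector.OnPoseDetected.RemoveListener(TryCatchFish);
    }

    private void TryCatchFish()
    {
        Debug.Log("Right hand closed: attempt to catch the fish.");
    }
}
```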
When developing the system that controls the hook position, I learned how to track where the user is pointing. If a ray is cast from the tip of a finger, you can follow the hit position and draw a line between the two points. The ray's hit coordinate can then be used to interact with the world. This system could also be used to interact with menus that are not part of the Unity UI in VR projects.
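A minimal sketch of that pointer, assuming `fingertip` is any Transform that follows the tracked index fingertip (this is not the project's actual script, just the general Unity pattern):

```csharp
using UnityEngine;

// Sketch: cast a ray from the fingertip and draw a line to whatever it hits.
public class FingerPointer : MonoBehaviour
{
    [SerializeField] private Transform fingertip;   // Follows the tracked fingertip.
    [SerializeField] private LineRenderer line;     // Two-point line renderer.

    private void Update()
    {
        if (Physics.Raycast(fingertip.position, fingertip.forward, out RaycastHit hit))
        {
            line.enabled = true;
            line.SetPosition(0, fingertip.position);
            line.SetPosition(1, hit.point);
            // hit.point can now drive the hook position or a menu selection.
        }
        else
        {
            line.enabled = false;
        }
    }
}
```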
During the animation phase of the project, I learned how tricky the Unity animation system can be. When I animated the swimming animation for the yellow fish, I realized that animating the coordinates of the sprite would prevent it from moving around through code: the animation writes position values every frame, overriding anything a script sets. The fish would stay in the middle of the screen instead of moving around. Idle animations like this should be done on a child object so they play in local space while the parent can still be moved from code.
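In practice this means the Animator sits on a child sprite whose local position is animated, while a script like the following (a sketch, not the project's actual mover) moves the parent:

```csharp
using UnityEngine;

// Sketch: the parent moves through code; the child's Animator plays the
// idle/swim clip on the child's local transform, so they no longer conflict.
public class FishMover : MonoBehaviour
{
    [SerializeField] private float speed = 1.5f; // World units per second.

    private void Update()
    {
        // Only the parent's world position changes here.
        transform.position += Vector3.left * speed * Time.deltaTime;
    }
}
```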
Final thoughts
With the game finally done, I can see how essential visuals are to a game. The same project would not look as impressive if I had not taken the time to create all the visuals and to integrate sound effects.
I now better understand how to organize different systems for larger projects. It is essential to keep the systems as decoupled as possible to limit the problems that could arise between them.
While watching my partner try out my game, I realized that users are not used to keeping their hands stable in mid-air and moving them independently. I saw that his left hand kept drifting while he focused on his right hand.
I decided to invest in a VR headset for my future projects. My passion has always been exploring different ways for the player to interact with tasks, and with a VR system, I can create new immersive experiences.
A better breakdown of the project can be found in the game section of my portfolio.