There are things I've written for fun. They're mostly unfinished, unpolished, and unworthy of being posted. Proceed with caution.
2015: Homemade geeky board game
- Free roll of HUGE paper from some friends
- 4 enthusiastic space background drawers/decorators, who each drew/colored their own space ship
- A bunch of cut-up pictures of enemy space ships and ship cargo (cows, ponies, piles of glitter, and fertilizer)
- 1 set of outrageous rules made up by daddy
- Some silly descriptions of how your cargo got out of control and you're now covered in glitter, have a cow driving your space ship, or are being licked by ponies
- A big boss bad guy at the end that almost took the whole family out
- Lots of laughs
2013: "Captain" (working title) -- click picture to install APK on Android. Quite unfinished.
2001: CIS 736/541/542 final project "Robots in a Maze"
Aaron Schroeder & Ben Claar
For our final project, we developed an OpenGL 3D display for our robotics project in CIS 541. We used standard GL functions to accomplish our project, which has features such as texture mapping, viewed object occlusion detection and clipping, camera angle and position adjustment, and viewed object position adjustment.
The most interesting part of our project as far as computer graphics is concerned is the clipping of walls that obstruct the camera’s view. When the robot or object that is being focused on is hidden behind a wall, the section of the wall that occludes the view is made transparent. This allows the viewer to see both the wall and the viewed object at the same time, giving a smoother effect than if we completely removed the offending wall.
The clipping had to be in three dimensions, since it would not appear correct if a wall that was in front of the camera but not obstructing the view was clipped. To accomplish this clipping, we originally considered traditional computer graphics algorithms such as a three-dimensional version of the Liang-Barsky, Cohen-Sutherland, or Cyrus-Beck algorithms, as documented in Foley (1997). However, these algorithms failed to take into account the extra information available for clipping against our walls, so we developed our own clipping algorithm to accomplish the task.
Figure 1. Diagram of our custom clipping algorithm.
The clipping problem of our scene can be broken into two components. First, does a wall obstruct the camera's view of the object? If so, second, clip a section of that wall. This is much faster and simpler than clipping a view volume against a plane segment, which would have been unnecessary and inefficient.
For the first part, it is sufficient to test whether the line from the camera to the viewed object passes through any wall. To accomplish this, we used a line-plane intersection algorithm we found on geometryalgorithms.com (April 2001). To implement this algorithm, we modified the C++ code found on that page, which is copyrighted by softSurfer (www.softsurfer.com) but freely usable (under certain terms; see our source code for the full copyright notice). If the line intersects the wall's plane, we then test whether the intersection point falls within the bounds of the wall itself. If so, we make a section of the wall transparent.
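The original source is long gone, but the intersection test can be sketched as follows. This is a reconstruction, not the actual project code: it assumes each wall is modeled as a rectangle given by a corner, two edge vectors, and a normal (names like `Wall` and `segmentHitsWall` are illustrative, not from the original).

```cpp
#include <cassert>
#include <cmath>

struct Vec3 { double x, y, z; };

static Vec3 sub(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static double dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

// A wall modeled as a rectangle: one corner, two spanning edge vectors,
// and the plane normal (u cross v).
struct Wall {
    Vec3 corner;
    Vec3 u, v;
    Vec3 normal;
};

// Returns true if the segment from 'camera' to 'target' passes through the
// wall. On a hit, 'hit' (if non-null) receives the intersection point.
bool segmentHitsWall(Vec3 camera, Vec3 target, const Wall& w, Vec3* hit) {
    Vec3 dir = sub(target, camera);
    double denom = dot(w.normal, dir);
    if (std::fabs(denom) < 1e-9) return false;       // segment parallel to plane
    double t = dot(w.normal, sub(w.corner, camera)) / denom;
    if (t <= 0.0 || t >= 1.0) return false;          // plane not between camera and target
    Vec3 p = {camera.x + t * dir.x, camera.y + t * dir.y, camera.z + t * dir.z};
    // Project the hit point onto the wall's edge axes to test whether it
    // lies inside the rectangle.
    Vec3 rel = sub(p, w.corner);
    double su = dot(rel, w.u) / dot(w.u, w.u);
    double sv = dot(rel, w.v) / dot(w.v, w.v);
    if (su < 0.0 || su > 1.0 || sv < 0.0 || sv > 1.0) return false;
    if (hit) *hit = p;
    return true;
}
```

Testing a camera-to-robot segment against every wall this way is O(walls) per frame, which is cheap for a maze-sized scene.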
To determine how much of the wall to make transparent, we defined a clipping width of approximately twice the size of the robot. We then split the wall into three separate walls before displaying it, making the center wall transparent. If the clipping width is larger than the entire wall, the whole wall is drawn transparent. Also, if the transparent wall is at the end of the wall, then the end wall will have length zero and so will not be drawn.
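Reduced to one dimension along the wall's length, the splitting step is simple interval clamping. A minimal sketch (again a reconstruction, with hypothetical names; `clipWidth` would be roughly twice the robot's size, as described above):

```cpp
#include <algorithm>

// The wall is drawn as three pieces along its length:
//   solid:       [0, solidLeftEnd)
//   transparent: [solidLeftEnd, transparentEnd)
//   solid:       [transparentEnd, wallLen)
// Zero-length end pieces are simply not drawn.
struct WallSplit {
    double solidLeftEnd;
    double transparentEnd;
};

// 'hitPos' is where the camera-to-object segment crosses the wall,
// measured along the wall from one end.
WallSplit splitWall(double wallLen, double hitPos, double clipWidth) {
    double lo = std::max(0.0, hitPos - clipWidth / 2.0);
    double hi = std::min(wallLen, hitPos + clipWidth / 2.0);
    return {lo, hi};
}
```

The clamping covers both special cases mentioned above: a clip width larger than the wall yields a fully transparent wall, and a hit near either end yields a zero-length end piece that is skipped at draw time.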
Conclusion and Performance:
We feel that our project turned out very well. We achieved usable performance on the machines in the Linux lab at K-State, which are not high-powered and have no hardware OpenGL acceleration. The presentation ran smoothly for our CIS 541 demo, which was on a Pentium III laptop with hardware OpenGL support, at an estimated 15+ frames per second. Although it was not flicker-free, it achieved its intended goal of mapping our robot presentation in real time.