In May 2005, I was working as a Software Engineer for Northrop Grumman TASC. I heard from colleagues that the Software Engineering Directorate (SED) was starting a new program of Basic Skills Trainers (BSTs) built on video-game technology. The program was a spin-off of the America’s Army project run by the Office of Economic and Manpower Analysis (OEMA) out of West Point. I applied for the program and was initially placed on the team as a Software Engineer.
My first task was to create a simulated Chemical Agent Monitor (CAM) as a first-person device in Unreal Engine 2 within one month. I set out to model and animate the device in Maya. Once I had implemented the animations, I programmed the FPS classes to allow interaction with the device. Finally, I created a volume type that could be placed within a level to define the type of chemical agent present. The device behaved as its real-life counterpart does: it took time to warm up, and once inside a volume it required a further delay before it could properly read the level of chemical agent. I completed the prototype on time, along with the subsequent task of implementing a simulated AN/VDR-2 radiac set.
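The warm-up and detection-delay behavior described above amounts to a small timed state machine. The original was written in UnrealScript inside Unreal Engine 2; the sketch below is a hypothetical Python reconstruction, and every class name, state name, and timing constant is an illustrative assumption rather than the actual implementation.

```python
from enum import Enum, auto

class CamState(Enum):
    OFF = auto()
    WARMING_UP = auto()
    READY = auto()
    DETECTING = auto()

class SimulatedCAM:
    """Hypothetical sketch of the simulated CAM's timing behavior.
    All names and timings here are assumptions for illustration."""
    WARM_UP_SECONDS = 15.0      # assumed warm-up time
    DETECT_DELAY_SECONDS = 5.0  # assumed dwell time inside an agent volume

    def __init__(self):
        self.state = CamState.OFF
        self.timer = 0.0
        self.agent_level = 0  # reading shown on the simulated display

    def power_on(self):
        self.state = CamState.WARMING_UP
        self.timer = 0.0

    def tick(self, dt, agent_volume_level=None):
        """Advance the simulation by dt seconds. agent_volume_level is the
        concentration of the chemical-agent volume the player is standing
        in, or None when outside any volume."""
        if self.state == CamState.WARMING_UP:
            self.timer += dt
            if self.timer >= self.WARM_UP_SECONDS:
                self.state = CamState.READY
                self.timer = 0.0
        elif self.state in (CamState.READY, CamState.DETECTING):
            if agent_volume_level is None:
                # Left the volume: reading clears, dwell timer resets.
                self.state = CamState.READY
                self.timer = 0.0
                self.agent_level = 0
            else:
                self.state = CamState.DETECTING
                self.timer += dt
                if self.timer >= self.DETECT_DELAY_SECONDS:
                    self.agent_level = agent_volume_level

cam = SimulatedCAM()
cam.power_on()
cam.tick(15.0)                       # finish warming up
cam.tick(5.0, agent_volume_level=4)  # dwell long enough to get a reading
print(cam.state.name, cam.agent_level)  # prints: DETECTING 4
```

In the engine, `tick` would be driven by the per-frame update and `agent_volume_level` supplied by the volume actor the player overlaps.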
After my initial prototype development, the management team elevated me to Project Manager of the then six-person development team. We tabled the N.B.C. Dismount work at that time and started on a new BST for the Common Remotely Operated Weapon Station (C.R.O.W.S.), a fearsome .50-caliber, remotely operated machine gun mounted atop an up-armored Humvee. This impressive weapon system was at the time deployed to Iraq and Afghanistan in support of the conflicts there. Soldiers could sit up to three miles away and use the F.L.I.R. camera to identify sniper targets and eliminate them through solid boulders. However, the system was so new that very few soldiers were effectively trained, and they certainly had not received adequate training hours for competent on-the-move target acquisition.
We developed a robust trainer that could connect to the fielded equipment and allow the soldier to use the existing joystick and display while engaging targets in the virtual world. In parallel, we worked on a similar project, the Javelin missile trainer. This trainer likewise let the user handle real-world equipment as the input device, in this case the targeting system, or Command Launch Unit (C.L.U.), while engaging targets in the virtual game environment.
Our work on the Javelin was truly revolutionary in that it used the very same image-recognition algorithm for target tracking as the real missile system. At the time, nothing like it had been done. The missile uses a camera in its nose to track the target as it flies, updating the track multiple times per second. If the tank veered or moved, the missile would continue tracking the target all the way to impact, provided the initial lock sequence was accurate.
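The per-frame tracking loop described above can be illustrated with a minimal sketch. The actual Javelin algorithm is proprietary; a sum-of-absolute-differences template search is a standard stand-in for this kind of tracker, and every function name and parameter below is an illustrative assumption, not the real system.

```python
def sad(frame, template, top, left):
    """Sum of absolute differences between the template and the frame
    patch whose top-left corner is at (top, left)."""
    h, w = len(template), len(template[0])
    return sum(
        abs(frame[top + r][left + c] - template[r][c])
        for r in range(h) for c in range(w)
    )

def track(frame, template, prev_top, prev_left, search=2):
    """Re-locate the template near its previous position by exhaustively
    scoring a small search window, as a per-frame tracker update would."""
    h, w = len(template), len(template[0])
    best = None
    for top in range(max(0, prev_top - search),
                     min(len(frame) - h, prev_top + search) + 1):
        for left in range(max(0, prev_left - search),
                          min(len(frame[0]) - w, prev_left + search) + 1):
            score = sad(frame, template, top, left)
            if best is None or score < best[0]:
                best = (score, top, left)
    return best[1], best[2]

# Toy example: a bright 2x2 "target" has moved one pixel right since the
# last frame, so the tracker re-acquires it at the new position.
template = [[9, 9], [9, 9]]
frame = [[0] * 6 for _ in range(6)]
frame[2][3] = frame[2][4] = frame[3][3] = frame[3][4] = 9
print(track(frame, template, prev_top=2, prev_left=2))  # prints: (2, 3)
```

Running this update on every camera frame, and re-sampling the template from each new best match, gives the continuous lock-and-follow behavior the paragraph describes.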
After completing the Javelin trainer to Raytheon’s specification and the C.R.O.W.S. trainer to the U.S. Army’s specification, we turned our attention back to the N.B.C. Dismount and N.B.C.R.V. efforts for the C.B.R.N. Chemical School in Aberdeen, MD. I was sent to Dugway Proving Ground, which happened to be my father’s childhood home and my grandfather’s command post, to observe C.B.R.N. procedures for reproduction in the virtual game environment. This was a real treat, as the location is off-limits to anyone not invited onto the post for official business. When I entered the location, I was greeted by two M1A1 Abrams tanks looking down the barrel at my vehicle. From there it got a bit surreal.
But I digress. The point is that I managed projects across classified and unclassified realms, all to completion and on time. For this reason, when the America’s Army 3 development team needed managerial help, I was assigned to go to California to take over the ailing development effort and help get it back on track.