So to summarize:
1. What Linux-compatible software do I need to make sense of the Kinect data?
2. Is a good open-source hexapod kinematics library available?
3. Is there any real-time simulation software (i.e., a game engine) available for this application?
Hexapods really don't have much payload capacity. But yes, there is hexapod source code available, though you may be disappointed: the IK is simple because each leg's joints are coplanar, so you can solve it with basic geometry.
http://kkulhanek.blogspot.com/2013/01/inverse-kinematics-for-3-dof-hexapod.html
With only 3 DOF there is a unique solution, so you don't have to choose between multiple poses.
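A minimal sketch of that 3-DOF leg IK: a yaw joint at the hip (coxa) points the leg at the target, and the remaining femur/tibia pair is a planar two-link chain solved with the law of cosines. The link lengths and coordinate conventions here are made up; substitute your own robot's dimensions.

```python
import math

# Hypothetical link lengths in mm -- measure these on your own leg.
COXA, FEMUR, TIBIA = 25.0, 60.0, 80.0

def leg_ik(x, y, z):
    """Joint angles (radians) for a foot target (x, y, z) in the leg frame:
    x forward, y sideways, z up (a foot below the hip has negative z)."""
    # Coxa rotates about the vertical axis to aim the leg at the target.
    coxa = math.atan2(y, x)
    # What remains is a 2-link chain in the vertical plane through the leg.
    r = math.hypot(x, y) - COXA        # horizontal reach past the coxa joint
    d = math.hypot(r, z)               # straight-line femur-joint-to-foot distance
    if d > FEMUR + TIBIA:
        raise ValueError("target out of reach")
    # Law of cosines; fixing the knee to bend one way makes the answer unique.
    knee = math.acos((FEMUR**2 + TIBIA**2 - d**2) / (2 * FEMUR * TIBIA))
    femur = math.atan2(z, r) + math.acos((FEMUR**2 + d**2 - TIBIA**2) / (2 * FEMUR * d))
    return coxa, femur, knee
```

A fully stretched leg straight ahead (`leg_ik(COXA + FEMUR + TIBIA, 0, 0)`) should give zero coxa and femur angles and a knee angle of pi, which is a quick sanity check before wiring it to servos.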
Google will find lots of source code for the Phoenix hexapod. The basic Arduino-based hexapod software is online but is very unsophisticated. It is really only set up for teleoperation (remote control), not for autonomous operation.
The other problem is knowing exactly where the robot is located. "Leg odometry" is very inexact: you get the best location data from wheels with encoders on the motors, because legs slip a LOT. Your 3D sensor data is not as useful as you'd like if you don't know the location of the sensor. Of course, you can use the sensor data to help refine your location estimate, if you already have a map. It's a chicken-and-egg problem called SLAM (simultaneous localization and mapping); Google will find plenty of SLAM algorithms.
You will need multiple ways to estimate your location: wheel encoders, or, if you must use a hexapod, dead reckoning by counting steps and headings and summing them up. Accelerometers, gyros, and magnetic compasses also help a LOT. Feed all of this into a Kalman filter (Google will find loads of info on Kalman filtering) and the filter will give you a good location estimate.
The above is how self-driving cars work, minus the motion planning.
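The predict/update loop described above can be sketched in one dimension: dead reckoning moves the estimate and grows its uncertainty, and each sensor reading pulls the estimate back and shrinks it. A real robot uses the multivariate form, but the structure is identical; all the noise numbers here are invented for illustration.

```python
# Minimal 1-D Kalman filter fusing dead reckoning with a noisy position sensor.

def predict(x, p, dx, q):
    """Motion update: move by dx, uncertainty p grows by process noise q."""
    return x + dx, p + q

def update(x, p, z, r):
    """Measurement update: blend estimate x with measurement z (variance r)."""
    k = p / (p + r)                       # Kalman gain: how much to trust the sensor
    return x + k * (z - x), (1 - k) * p

x, p = 0.0, 1000.0                        # start: unknown position, huge uncertainty
for step, measured in [(1.0, 1.1), (1.0, 2.1), (1.0, 2.9)]:
    x, p = predict(x, p, step, q=0.1)     # odometry claims we moved 1.0
    x, p = update(x, p, measured, r=0.5)  # noisy rangefinder/GPS reading
```

After a few iterations the variance `p` collapses from 1000 to well under 1, which is exactly the behavior you want from fused sensors: each individually noisy source tightens the combined estimate.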
Game software is a good idea for visualization. You can use a game engine to make a nice display, but ROS already has Gazebo integrated into it. At the level of complexity you are looking at, I think you want a ROS-based solution rather than an Arduino or bare microcontroller.
http://www.ros.org/core-components/
I would start with a small wheeled robot. Get it to the point where it "knows" its location from simple sensors like wheel encoders, an IMU, and GPS. This will take weeks of work and some study; you may have to re-learn linear algebra. Then you can add sensors like a camera, sonar, an IR rangefinder, or, if you have a spare $3K, a laser scanner. Later, try to make all of this fit into the payload capacity of a hexapod (or quadcopter, or whatever). The key is to fuse multiple sensors.
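For the "knows its location from wheel encoders" step, the usual starting point is differential-drive dead reckoning: convert encoder ticks to wheel distances, then integrate the pose. This is a sketch with made-up robot dimensions; calibrate `TICKS_PER_REV`, the wheel diameter, and the wheel base on your own hardware.

```python
import math

# Hypothetical robot constants -- measure these on your own chassis.
TICKS_PER_REV = 360
WHEEL_DIAMETER = 0.065            # meters
WHEEL_BASE = 0.15                 # distance between the two wheels, meters
M_PER_TICK = math.pi * WHEEL_DIAMETER / TICKS_PER_REV

def odom_step(x, y, theta, left_ticks, right_ticks):
    """Integrate one pair of encoder readings into the pose (x, y, heading)."""
    dl = left_ticks * M_PER_TICK
    dr = right_ticks * M_PER_TICK
    d = (dl + dr) / 2.0               # distance traveled by the robot's center
    dtheta = (dr - dl) / WHEEL_BASE   # change in heading
    # Small-step approximation: travel along the average heading.
    x += d * math.cos(theta + dtheta / 2.0)
    y += d * math.sin(theta + dtheta / 2.0)
    return x, y, theta + dtheta

# Drive straight: both wheels advance by the same number of ticks.
pose = (0.0, 0.0, 0.0)
for _ in range(10):
    pose = odom_step(*pose, 36, 36)
```

Every step of this integration accumulates error (and legs slip far worse than wheels), which is why the pose from code like this gets fed into the Kalman filter alongside the IMU and GPS instead of being trusted on its own.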
But BEFORE you do anything else, watch this online class (and do the exercises). It covers the basics of how the Google self-driving car works, and you need to do the exact same thing:
https://www.udacity.com/course/cs373