Cocoa Kinect Wrapper

I have wanted to do this for a long time, and I have finally released a (hopefully) easy-to-use Objective-C wrapper for the libfreenect library!

You can find this wrapper in my GitHub repo where you can download and use it fairly freely.

Also, if you want to make edits to it or fix problems you find then please fork the repo and send me a pull request :)

If you do have problems though and cannot fix them yourself then please submit a new issue and I will do what I can.

If you haven’t looked at it already, check it out now at https://github.com/jimjibone/cocoa-freenect

Finding a responsive CSS system

I have spent a stupid amount of time looking for a responsive CSS theme to use for my website and I just haven’t been able to find something that I like. It has driven me mental and, as a result, has caused me to go through 9 versions of the site. If you look at the GitHub source you will see how many attempts I’ve made to make something nice.

A few notable mentions are:

  • Bootstrap - I like how amazingly responsive it is and how easy it can be to build pages with, but you do get a very strong feeling of “this is an application”, something I didn’t want for my little profile/CV page.
  • Frameless - It’s a very nice idea, allowing you to use LESS and make individual elements responsive in their own way. However, it required too much effort from me to make a nice website, so I scrapped it.
  • cssgrid - It makes me sad that its author is retiring it. It is also very nice and very minimal – just a bare-bones grid with no styles or anything. At the time this was perfect, but I still wanted a bit more of the styling done for me.
  • And finally, Gridism

And Gridism is the current winner! This is due to the simple HTML required to make a responsive grid, and its nice, simple styling that already looks great for what I want.

For the time being, you can see what I have created using Gridism by going to jamesreuss.co.uk/proto6


Back

I haven’t posted here for a long long time. And I think it’s about time that got fixed.

Since I last posted I have done a lot. I have finished my helicopter project, acquired a coffee machine, finished another Kinect-related project and done A LOT of tinkering.

It’s website time

So as you can probably see, it has been a very long time since I last posted! That would be because of the workload from my 3rd Year Project working with the Kinect. Well, that’s completed now, I won an award for it, and now I’m using all that expertise in my 4th Year Project!

But here’s some news: I have been building a website. It’s hosted by GitHub and I have a nice URL to go with it: jamesreuss.co.uk. Check it out. It’s not much to look at at the moment, but as time progresses it will contain a lot more information about my projects and, in particular, processing with the Kinect. I can foresee that most of the code will be developed on a Mac for Macs :) apart from, of course, all the code that’s developed for Arduino and other platforms like that… Raspberry Pi.

I will also be working on a way to get some sort of blog set up on that site too, so this blog may actually become redundant… but I think it’s probably for the best, because I’ll be able to share the code, and how I did it, with the internet.

In the mean time… jamesreuss.co.uk


The Beginning of the Cocoa Kinect Example

Edit: I now have a new slightly polished cocoa-freenect wrapper for use in your Kinect-Cocoa projects! Check out the post here.

Today I pushed the final application for my libfreenect-on-the-Mac beginners’ guide to GitHub for everyone to see and download. Hopefully this will help a lot of Kinect beginners get started with their projects and produce some cool things :)

I did forget to mention in the Readme, but you will need to be running Mac OS X 10.7 (Lion) to build this program. I’m not sure if the included app will work on older versions, but you can give it a go :) If you would like to build it on an older version, you will need to change some of the code in the “Kinect Processing.m” file. The code causing problems is the @autoreleasepool {} block, which is not available with earlier SDKs, so you will have to change it to the older NSAutoreleasePool version. It’s not too hard, but I won’t be making the change myself I’m afraid, because the code looks nicer this way :)
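
For anyone making that change themselves, here is roughly what the swap looks like (just a sketch, not a copy of the project file):

// Lion-and-later form used in "Kinect Processing.m":
@autoreleasepool {
    // ... per-frame Kinect work ...
}

// Roughly equivalent pre-Lion form, if you want to backport it:
NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];
// ... per-frame Kinect work ...
[pool drain];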

Also, for all of you out there without a Kinect, I have included a sample point cloud file which you can import into the app in all the usual Mac ways: dragging the file onto the app icon, double-clicking the file, using the File->Open… menu and, of course, using the “Import” button in the app.

I hope you enjoy :) and here is the Readme that is on the GitHub page:

OpenKinect Cocoa Example

This uses the libfreenect library produced by the good people of the OpenKinect community. This code gives an example of how to use the libfreenect library in your Cocoa applications on Mac OS X.

It took me ages to learn how to begin programming with the Kinect on my Mac, and there wasn’t a great deal of help on the internet that I could find :( So I spent a long time figuring it all out (especially with OpenGL, that thing is a bastard) and finally created this app, which will form the final application for a guide I will make in the summer.

The guide will take a semi-beginner programmer (someone who is already experienced with Objective-C – I’m not going to teach that, but I will give a link to a guy on YouTube who taught me), show them how to install all the libraries they need and then take them through all the steps necessary to produce this code.
To be honest, I wish I had found this on the internet myself ha ha, oh well :) I like working things out.

To use this code you will first need to install libfreenect:
- There’s the OpenKinect website, which will be more up to date – http://openkinect.org/wiki/Getting_Started
- Or there is my website where I have outlined a method – http://jamesreuss.wordpress.com/2012/01/28/installing-openkinect-and-opencv-the-easy-way-on-a-mac/

And then you will need to download the code from this GitHub page. Your best bet is probably the “download as .zip” button, or going into your Terminal app and pasting in:
git clone git://github.com/jimjibone/OpenKinect-Cocoa-Example.git

You can then open up the “OpenKinect Cocoa Example.xcodeproj” file and build & run it and have a play. Make sure you have a Kinect though ;)

A feature you might like, though, is the ability to export and import point cloud files (.pcf). I’ll include one in there for you to play with if you don’t have a Kinect yet.


Weeeyyy

A quick update :)

Full screen and efficient Kinect viewing.

So, here is a bit of news about the project so far:

  • I have made quite a bit of progress with my project using Processing (link), but the problems were that it was not very efficient at drawing to the screen and I didn’t have full control over the data – although it was very easy to use.
  • So then I made some more progress on the Objective-C and Cocoa side of the project (as you can see above) and things are going well :) Now I just need to get the object detection working, which I had managed when using Processing. I will show this off soon.

What I’m going to do soon:

  • A beginners’ guide to using the Kinect in Cocoa :) Seeing as there isn’t much help out there for beginners (like I was..), I will be creating this guide to help out all the Mac programmers. Also, once it’s done, users of other operating systems may be able to follow some of the methods used when building their own apps.
  • Get some more pictures of object detection up!
  • Then some helicopter tracking..

Awesome.


Lots of Kinect Point Cloud Views!

Well, today, wow, oh my god, this is a big one. So today I managed to get a few things working better with the Kinect cloud view in the KinectiCopter app.

The view is able to display in both colour (RGB data from the Kinect) and plain white. I have also created a new KCKinectController class which collects the Kinect data and controls the display of 4 separate NSOpenGLViews: isometric (just rotatable at the moment, might stay this way..), front view, top view and side (right) view.
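
Here is a rough sketch of the idea (class and method names below are just illustrative, not the actual KinectiCopter code): one controller keeps the newest frame and simply tells every registered view to redraw from it.

#import <Cocoa/Cocoa.h>

@interface KCCloudFanOut : NSObject
@property (nonatomic, copy) NSArray *cloudViews;            // the four NSOpenGLView subclasses
@property (nonatomic, assign) const uint16_t *latestDepth;  // one shared depth buffer
@end

@implementation KCCloudFanOut
- (void)depthFrameArrived:(const uint16_t *)depth {
    self.latestDepth = depth;                   // every view reads this same buffer in its drawRect:
    for (NSOpenGLView *view in self.cloudViews) {
        [view setNeedsDisplay:YES];             // each view just applies its own camera transform
    }
}
@end

Each view then only differs in the projection it sets up before drawing the shared points.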

Pictures!

Also, error handling! Well, error handling for there being no Kinect connected. If the program detects no Kinect connected, it will display a message in the console and quit 2 seconds later. Obviously, in later versions this will be changed so that the user receives a better error message that they can actually see without opening the Console app.
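
The check itself is simple; a minimal sketch of it (not the exact KinectiCopter code) using the core libfreenect API looks something like this:

#import <Cocoa/Cocoa.h>
#import <libfreenect/libfreenect.h>

static BOOL kinectIsConnected(void) {
    freenect_context *ctx = NULL;
    if (freenect_init(&ctx, NULL) < 0) return NO;   // could not even start libfreenect
    int count = freenect_num_devices(ctx);          // how many Kinects are plugged in?
    freenect_shutdown(ctx);
    return count > 0;
}

// Then, somewhere early in the app delegate:
// if (!kinectIsConnected()) {
//     NSLog(@"No Kinect connected - quitting in 2 seconds.");
//     [NSApp performSelector:@selector(terminate:) withObject:nil afterDelay:2.0];
// }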

I’ll put up a post sometime soon showing how I managed to get so many instances of NSOpenGLView all showing one point cloud with just one set of calls for the Kinect point cloud. Then everyone will be able to do it :) except I won’t be putting up my own code, because that’s for my uni project!…

Lovely stuff.


How can I do some processing of Kinect data?

The next task in my project: find out the location and size of all the objects in the scene! Sounds a bit tricky…

After a quick bit of looking around I found PCL (the Point Cloud Library), which contains a lot of functions for manipulating point clouds and calculating their features – the stuff the Kinect gives us :) Good stuff.

Let’s give it an install and have a play with the tutorials (they look wicked, definitely the best tutorials for a library I’ve seen so far :) ).

So. After many install attempts over a week, I finally got it to install properly. That was soooo hard! I documented the easy way of doing it in the end, so I can share it if this next bit works out… After it was all installed properly, everything seemed awesome until I tried to compile some sample C++ code in Xcode. This did not work out so nicely :(

I think the problem is that Xcode does not see the correct header files, or it is looking in the wrong place for them. When trying to #include the PCL headers it makes you put the “pcl-1.4/pcl/…” folder at the front, which messes up all the other #include statements within the header files, as they are only looking for “pcl/…”.
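
Just to illustrate the clash (the header names below are only examples):

#include <pcl-1.4/pcl/point_types.h>   // the only form Xcode will accept for me...
// ...but inside PCL the headers include each other like this:
#include <pcl/point_cloud.h>           // ...which Xcode then cannot find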

So after lots of hunting on the internet for a way to fix this, I tried compiling the sample code with CMake using the documented method and it worked fine! This led me to believe that the libraries were all working and that it was just something to do with how Xcode works.

So, I did lots more searching around and found a post (look near the bottom) which talks pretty much about the problem I’m having, I think. What I need to do at some point now is uninstall Homebrew and the libraries it installed, then install MacPorts and install PCL that way. This will involve A LOT more painful installing and compiling (because MacPorts takes loads longer for some reason), and I find it harder to use (I love how simple the Homebrew commands are :) )

So, when I eventually get round to installing this, which may be sometime soon, I’ll be able to come back and tell everyone how it went and how to fix/avoid all the problems that I ran into.

Please, please, please let PCL work. It would make things sooooo much easier!!

bla


Installing OpenKinect and OpenCV the easy way on a Mac

Edit: I now have a new, slightly polished cocoa-freenect wrapper for use in your Kinect-Cocoa projects! Check out the post here. On the repo there are also new and improved installation instructions!

For ages I tried to find an easy way to install both libfreenect (OpenKinect) and OpenCV on my Mac without having to follow loads of lines of instructions and get lots of errors in the process.

So, after quite a while of searching around for an easy install, which didn’t involve me installing loads of different package managers, I came across these methods which work really nicely :)

First, install Homebrew, if you don’t have it already:

  • It’s super easy… just follow the instructions… GO!
  • That was crazy fast, now do one of these, or both, or none…

OpenKinect:

  • If you still have your terminal open, type in (or copy and paste) these commands one at a time, pressing enter after each.
cd /usr/local/Library/Formula


curl --insecure -O "https://raw.github.com/OpenKinect/libfreenect/master/platform/osx/homebrew/libfreenect.rb"
curl --insecure -O "https://raw.github.com/OpenKinect/libfreenect/master/platform/osx/homebrew/libusb-freenect.rb"
brew install libfreenect
  • Then, once all that’s finished, you’re done with OpenKinect :) You can give it a test by typing this into the terminal (as long as you’re still in the install directory..)
glview

OpenCV (2.3.1a):

  • Now, OpenCV is even easier! Just type in (or copy and paste) this command:
brew install opencv
  • This one takes a bit longer than OpenKinect…
  • And then, you’re all done!
  • You can close terminal now :)

Now the last task. If you want to use these libraries in Xcode just follow these steps:

  • These instructions are for Xcode 4, by the way. Sorry Xcode 3 people, I started using Xcode from version 4.. But don’t worry: if you know what you’re doing then this is pretty much the same process as using the built-in Mac libraries.
  • If you’re creating a Cocoa application it’s a little simpler:
    • Open your YourApp.xcodeproj file
    • Select the target you want to add it to (there will usually only be one)
    • Then click “Summary”
    • And under “Linked Frameworks and Libraries” press the plus button
    • In the search field that shows up type in “libfreenect”
    • You will see all the libraries that begin with libfreenect, so now you can choose one :) I usually use libfreenect_sync because it is easier to use in Objective-C programs, from what I have seen so far. Select the latest version of it that shows up.
    • That’s it for that. Now, to include it in your application code, just add this: #import <libfreenect/libfreenect_sync.h>
  • Now, if it isn’t a Cocoa application but a Console application do these things:
    • Open your YourApp.xcodeproj file
    • Select the target you want to add it to
    • Click on the “Build Phases” tab
    • Then under “Link Binary With Libraries” press the plus button
    • In the search field that shows up type in “libfreenect”
    • Now you can select which library you would like to use as I discussed in the second to last point of the Cocoa app method.
    • All done!

So now you should be all sorted with OpenKinect and OpenCV (when searching, the OpenCV libraries are called libopencv… there are quite a few, though; I won’t go into what does what and what you need just yet, as I don’t really know that myself..)
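
If you want a quick sanity check that the linking worked, here is a minimal throwaway test (assuming the libfreenect_sync setup above) that grabs one depth frame and prints a value from it:

#import <Foundation/Foundation.h>
#import <libfreenect/libfreenect_sync.h>

int main(int argc, const char *argv[]) {
    @autoreleasepool {
        void *depth = NULL;
        uint32_t timestamp = 0;
        // Device index 0, 11-bit depth. Returns non-zero if no Kinect can be reached.
        if (freenect_sync_get_depth(&depth, &timestamp, 0, FREENECT_DEPTH_11BIT) != 0) {
            NSLog(@"Couldn't grab a depth frame - is the Kinect plugged in?");
            return 1;
        }
        uint16_t *pixels = (uint16_t *)depth;
        NSLog(@"Got a 640x480 depth frame, centre pixel = %u", (unsigned)pixels[240 * 640 + 320]);
        freenect_sync_stop();   // shut the background capture thread down cleanly
    }
    return 0;
}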

Anyways, bye bye everyone


Making little bits of progress :)

So, at the moment I’m in the middle of my January exams :( so work on the project is taking a very far back seat. But I decided to have a little go at making a few things tidier and doing a little experimenting.

Firstly, I came up with a name! I decided on “KinectiCopter”. I took Helicopter and Kinect and stuck the two together (if you didn’t realise).

I then thought up a few ideas for a logo/icon for my project/application and finally came up with (after a lot of time staring at the screen) the following logo :)

KinectiCopter Icon Version 3

After sorting out a new icon for the KinectiCopter application I hadn’t even made yet, I was itching to do something, so I decided to put off revision a little more and began making a new project, which may well be the final one. I copied over all the work I had previously done on the Kinect into this project. So, I had my KCKinectController class, which took care of the interface between the libfreenect_sync library and my Objective-C program, and previous work where I had got the video and depth data to display using OpenGL/OpenCV.

I then couldn’t resist making another OpenGLView class which, instead of showing the user 2D images of video and depth, showed a point cloud view of the scene. There was an example of this in the OpenKinect examples, so I had a long hard look at it to work out what it did and how, and then I recreated it in my own program :)

When I built it, it seemed to work fine, except that the view seemed really zoomed out compared to the example, and when I did zoom in things looked a bit like they were overlapping… I had a quick think and decided that instead of using my own KCKinectController class to get the depth and video data I would just go straight to libfreenect_sync and get the data from that. This seemed to fix everything! But why? No idea. I do have a feeling, though, that it has something to do with me converting the depth/video data to an IplImage for OpenCV and then converting it back to raw data.

So, I’m now thinking that instead of getting the controller to convert to IplImage and then back to data, I should just get the controller to give the user the data it collected in the first place, and then have some other methods for converting it when I want to use OpenCV.
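
A rough sketch of what I mean (just the idea – the names are made up, not the real KCKinectController API): the controller hands back the raw 640x480 depth buffer, and wrapping it for OpenCV becomes a separate step that only happens when it is needed.

#include <stdint.h>
#import <opencv/cv.h>

// The controller would now just return its raw depth buffer (uint16_t *, 640x480) untouched.
// Conversion for OpenCV lives in its own little helper:
static IplImage *KCWrapDepthForOpenCV(uint16_t *rawDepth) {
    // cvSetData just points the header at the existing buffer - no copy, no round-trip conversion.
    IplImage *image = cvCreateImageHeader(cvSize(640, 480), IPL_DEPTH_16U, 1);
    cvSetData(image, rawDepth, 640 * sizeof(uint16_t));
    return image;   // release with cvReleaseImageHeader(&image) when done, not cvReleaseImage
}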

That’s what I’ll try next :)
