This semester I’m writing my master’s thesis, and alongside it I have to do an advanced seminar at my university. The seminar covers state-of-the-art topics in real-time computer graphics. Here is the abstract, and the full paper as a PDF.
Recent video games increasingly try to produce a more cinematic atmosphere. An important part of cinematography is depth of field and motion blur. With depth of field a director can focus a viewer’s attention on a certain region of the scene; it can be seen as an artistic tool for motion pictures. To bring video games closer to motion pictures, this artistic tool is also widely used in recent video games. Although motion blur is more of a by-product of the physical limitations of photo and video cameras, it can also be used as an artistic tool to enable a better visual perception of movement and actions in motion pictures and video games.
This report takes a look at some state-of-the-art techniques for rendering motion blur and depth of field in real time on modern graphics pipelines. Depth of field is covered with an older, widely used approach that is extended with the effect of a fast aperture bokeh. Motion blur is covered with a rather new approach that can also be used for depth of field. This approach is intended for future generations of graphics hardware and is not production-ready today.
You can download the full paper here.
Lately I used OpenCV in an Xcode project. Maybe someone else needs it too, so I thought I’d write down what I had to do to get it running.
The easiest way to install OpenCV on Mac OS X is with Homebrew: http://mxcl.github.com/homebrew/
brew install opencv
This should install all the OpenCV libraries. I got an error about a missing Fortran compiler, so I had to run:
brew install gfortran
brew install opencv
- Create a new Xcode project and add the OpenCV paths to the header search paths and the library search paths. (Already configured in the attached project)
HEADER_SEARCH_PATHS = /usr/local/Cellar/opencv/2.4.2/**
LIBRARY_SEARCH_PATHS = /usr/local/Cellar/opencv/2.4.2/**
- Set the linker flags for the required libs. (Already configured in the attached project)
OTHER_LDFLAGS = -lopencv_core -lopencv_highgui -lopencv_imgproc
- If you now build the project, there will be an error in ‘lsh_table.h’. Replace the offending line (“if (!used_speed_) buckets_…”) with the following:
buckets_space_.rehash((buckets_space_.size() + dataset.rows) * 1.2);
That’s it, we’re done!
Here is an Xcode project that has step 2 and 3 already configured.
As you may know, I worked for more than two years as a professional iOS and Mac developer at equinux ag. But in March this year I started a master’s in computer science, which means I get plenty of free time during my summer break. Besides surfing a lot (not the internet; the ocean and the river) I built a new iPhone app — Beatathlon.
Beatathlon is an app for creating your own workout mixes from the music in your iTunes library. You can combine multiple parts with different paces into one workout, so you can create workout mixes that fit your abilities exactly while hearing your favorite tunes. For more information take a look at the Beatathlon website: www.beatathlon.com, or buy it on the App Store.
You’re out of ideas? Stuck with your project? No idea where to go from here? Just put on your sports shoes, get out, and do some sports! When you’re exhausted, go back, have a shower, and you’ll see: the ideas or solutions just pop up in your head, as if they were there all along. For me, this helps every time and keeps me going.
I wanted to play music in the background on my iPhone. To make the transition between two songs smooth and nice, they should overlap and fade. To achieve this I used the AVPlayer class: I created an instance of it and started playing. After some time it starts fading out, creates a new AVPlayer instance with a new song, and starts playing again. This works perfectly well as long as my app runs in the foreground. But when it is in the background, it just fades out the old AVPlayer instance and never starts playing the new one.
So I looked around for what I was doing wrong, and along the way I figured out that I was doing it completely wrong: using multiple AVPlayer instances was a bad idea. Instead, an AVMutableComposition should be used; it’s designed for exactly this purpose. Maybe I’m not the only one struggling with this, so I put up this blog post about it.
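The core idea looks roughly like this Objective-C sketch (the asset variables and the fade duration are placeholders, and error handling is omitted): two audio tracks in one composition, overlapping time ranges, and an AVMutableAudioMix with volume ramps for the crossfade, all driven by a single AVPlayer.

```objc
#import <AVFoundation/AVFoundation.h>

// songA and songB are AVURLAssets for the two tracks (placeholders).
AVMutableComposition *composition = [AVMutableComposition composition];
AVMutableCompositionTrack *trackA =
    [composition addMutableTrackWithMediaType:AVMediaTypeAudio
                             preferredTrackID:kCMPersistentTrackID_Invalid];
AVMutableCompositionTrack *trackB =
    [composition addMutableTrackWithMediaType:AVMediaTypeAudio
                             preferredTrackID:kCMPersistentTrackID_Invalid];

CMTime fade = CMTimeMakeWithSeconds(5.0, 600);  // 5 second crossfade
CMTime startB = CMTimeSubtract(songA.duration, fade);

// Song A from the beginning, song B starting where A begins to fade out.
[trackA insertTimeRange:CMTimeRangeMake(kCMTimeZero, songA.duration)
                ofTrack:[[songA tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0]
                 atTime:kCMTimeZero error:NULL];
[trackB insertTimeRange:CMTimeRangeMake(kCMTimeZero, songB.duration)
                ofTrack:[[songB tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0]
                 atTime:startB error:NULL];

// Volume ramps: A fades out while B fades in over the same time range.
AVMutableAudioMixInputParameters *paramsA =
    [AVMutableAudioMixInputParameters audioMixInputParametersWithTrack:trackA];
[paramsA setVolumeRampFromStartVolume:1.0 toEndVolume:0.0
                            timeRange:CMTimeRangeMake(startB, fade)];
AVMutableAudioMixInputParameters *paramsB =
    [AVMutableAudioMixInputParameters audioMixInputParametersWithTrack:trackB];
[paramsB setVolumeRampFromStartVolume:0.0 toEndVolume:1.0
                            timeRange:CMTimeRangeMake(startB, fade)];

AVMutableAudioMix *audioMix = [AVMutableAudioMix audioMix];
audioMix.inputParameters = [NSArray arrayWithObjects:paramsA, paramsB, nil];

// One AVPlayer plays the whole mix, so background playback keeps working.
AVPlayerItem *item = [AVPlayerItem playerItemWithAsset:composition];
item.audioMix = audioMix;
AVPlayer *player = [AVPlayer playerWithPlayerItem:item];
[player play];
```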
A few days ago this site came up on my Twitter stream: http://fixradarorgtfo.com/. Shortly after that, some people complained that it is disrespectful. I don’t agree with that. Maybe it’s not the nicest way to say it, but I think we can all agree that something has to change. Here are a few things I would like to see, because I don’t agree with all the points in the “Fix Radar or GTFO” text.
- Open: By default a Radar bug should be viewable by everyone. If I have an issue, I can search for it and see if that bug has already been reported. Then I don’t have to do all the work, which might include creating a new Xcode project, writing a test case to reproduce the error in a vanilla environment, etc. This would be a win-win situation: Apple doesn’t have to search for duplicates, and we save a lot of time otherwise spent on writing duplicates. (Now that I’ve written it down, the status quo sounds even more ridiculous.) For people who have to file secret bug reports, a simple “private” flag would still do.
- Rewards: Every accepted bug report should give some reward, and that reward could be traded in for support incidents. That would be a fair trade: we help Apple find bugs, they help us fix our problems. This would motivate a lot of people to write bug reports. Just look how well Stack Overflow does with exactly that motivation!
- Interface: As “Fix Radar or GTFO” already said, that interface is so 90’s, and we’re now in 2012! In my opinion, putting bug reporting into Xcode is not such a good idea; there’s too much stuff in Xcode already. Instead, make a nice new app and put all the bug reporting in there. And if that’s too much, at least build a proper, state-of-the-art web app for it.
- Feedback: Please, Apple, communicate with us sometimes. Right now it mostly feels like your bug reports go straight to /dev/null. If you’re lucky you get a “This is a duplicate, GTFO” response, but most of the time you get absolutely no response for months, if not years. This brings us back to the first point: make it open, and then communicate with us about what’s going on, so we know what we can rely on and what not.
These are my four main points about Radar; any other improvements are welcome too. But there have to be some changes in the near future! Developers, among many others, made Apple the most valuable company in the world; now Apple should show the necessary respect by improving bug reporting.
Today when I woke up and looked at my Twitter stream I saw this video (thanks to @notch). It made me donate some money to them; hopefully it will make you donate too.
A good article about the project (in German): http://blog.zeit.de/netzfilmblog/2012/03/08/joseph-kony-2012-social-media-uganda-film/
Update: Sopcast is now officially available for Mac OS X. See http://www.sopcast.cn/download/mac.html
Using SopCast on the Mac was always very hard: you had to run it in Windows in a virtual machine, which was very inconvenient. But lately I discovered that with WineBottler.app you can build a SopCast.app with just a few clicks.
Since I’ve been playing around with the Zucchini framework, I’ve discovered a few pitfalls. Maybe someone else stumbles upon the same problems, so I thought I’d write a short blog post about them.
1. Weird “The operation could not be completed.” error:
I got this error message:
and the console says
-[NSAlert alertWithError:] called with nil NSError. A generic error message will be displayed, but the user deserves better.
In this case, the path to your .app bundle in the config.yml is most likely wrong. Double-check that the path is valid!
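For reference, the relevant part of my config.yml looks something like this (the path is a placeholder from my setup, and the key name may differ in your Zucchini version, so double-check against the Zucchini docs):

```yaml
# config.yml: the app path must point at the actual .app bundle
app: /Users/me/MyApp/build/Release-iphonesimulator/MyApp.app
```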
2. You get a “doesn’t define a screen context” message in the log:
The log says something like
Most likely you’ve added a new line in your .zucchini file where you shouldn’t. You should only add new lines before a “Then on the” line and nowhere else. If you want to structure your file a bit better, you can use “#” for comment lines.
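To illustrate, here is a hypothetical feature file (the screen names and steps are made up): comment lines are fine anywhere, but new steps only go before a “Then on the” line:

```
# Log in and check the dashboard (comments like this are ignored)
Start on the "Login" screen:
  tap the "Log in" button
Then on the "Dashboard" screen:
  take a screenshot
```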
3. When running in a CI environment, the test hangs and Instruments endlessly leaks memory.
I described this problem before in this post. The only workaround I could find was executing Zucchini from Jenkins over SSH:
ssh user@server "cd /path/to/checked/out/repo/ && rake"
4. Unspecified “No such file or directory” message in the log
The log shows something like
/Library/Ruby/Gems/1.8/gems/zucchini-ios-0.5.4/lib/feature.rb:49:in `initialize': No such file or directory
with no further useful info. In my case the cause was always the missing empty folder “run_data”: Zucchini expects this folder to exist when it runs. But because you don’t want run data in your repository, you normally don’t add the folder at all. So add an empty file like “.gitkeep” or “.hgkeep” to it; that way the otherwise-empty folder is part of the repository and gets created when cloning/updating.
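In shell terms, the fix is simply this (using .gitkeep for a git repository):

```shell
# Create the empty folder Zucchini expects and add a placeholder file
# so version control keeps the otherwise-empty directory around.
mkdir -p run_data
touch run_data/.gitkeep
ls -A run_data
# then commit it: git add run_data/.gitkeep
```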
Whether you like to eat zucchini or not, you have to take a look at this tasty testing framework for iOS. Over time I’ve looked at a lot of testing frameworks for iOS, and now I came across this new framework for interface testing.
It’s not some completely new, fancy way to test; you still have to write tests and run them. But it splits the tests into two very useful parts: one where you write what you want to test in a more or less natural language, and another where you define screens and the features every screen has. These screens are reusable across tests.