Today was a beautiful sunny day in Zürich. After a week of rain, I seized the opportunity to bike in the sun! I started biking to Spreitenbach to see the Bruno Weber Park, which is roughly 10 km west of Zürich. However, either fate wanted me to go south, or I'm just bad with directions (admittedly, probably the latter), because I ended up heading towards Bremgarten in the Canton of Aargau. I had seen a picture of a cool old tower in the Bremgarten Altstadt, so I continued biking there, which turned out to be another good option!
I recently moved to Schlieren, a city directly west of Zürich. It's a quiet city, and it feels much closer to classic Switzerland than Zürich city, where I lived for two months. My co-worker also moved to Schlieren, so we decided to explore the Schlieren forest, which is about a 15-minute walk from where I live.
There are trails all over the forest. Even some fire areas! We explored mainly the western half of the forest for about 2 hours, then ended up in Uitikon, a small farming village. Unfortunately, by the time we arrived it was too dark to take pictures on my potato camera, so I can't show you the cows and fields. At some point, I will definitely bike/hike back there and document said cows 🙂
In early May, I moved to Zürich for a software development internship at Verity Studios AG.
My roommate and I explored a bit, and found this cool park called MFO, which is a massive metal cube structure with vines growing everywhere. The pictures (or Google Maps) don't really do it justice!
We also found the obligatory bull of Zürich:
In the second week or so, I bought a Swiss made bike from a sketchy flea market:
At WearHacks Toronto 2016, we were trying to think of project ideas, but nothing came to mind that we all agreed on. So Adrian, Austin and I went out for coffee to take a mental break. While at the coffee shop, we were talking about all the types of light humans are blind to. Wouldn't it be amazing if we could see the whole spectrum of light, including infrared and ultraviolet? … maybe we can!
When we got back to the Bitmaker offices (where the hackathon was held), we checked out the available hardware to see if there were any IR cameras. Unfortunately, the only IR camera we could find was on the Kinect, and its IR camera only captures intensity (not spectrum data). So we made do with what we had and decided to use both the IR and RGB cameras of the Kinect to render the two layers on top of each other.
Josh, Austin and I worked on the translation algorithm while Kulraj and Nick created an SFML GUI. The color-space transformation algorithm we ended up designing and implementing was to shift the visible light by 10 degrees (hue), then shift the IR spectrum into the first 10 degrees of red hue. Essentially, we squeezed the visible light from the non-IR camera so there was no red left, then put a red layer on top of that image to display the infrared data.
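The idea can be sketched in a few lines (a minimal illustration under my own assumptions, not the actual InfraViewer code; function names are hypothetical):

```python
def compress_visible_hue(hue_deg):
    # Squeeze the full 0-360 degree visible hue range into 10-360,
    # freeing up the first 10 degrees of red for IR data.
    return 10 + hue_deg * (350 / 360)

def ir_to_hue(ir_intensity):
    # Map a normalized IR intensity (0.0-1.0) into the freed
    # 0-10 degree band of red hue.
    return ir_intensity * 10
```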
The hardware turned out to be pretty difficult to overlay. The IR and visible light cameras had significantly different resolutions and aspect ratios, so we had to program some magic to translate the IR data to fit on top of the RGB data. Because of this, the IR layer was slightly offset from where it should have been. For instance, a slight red glow would appear around our bodies (from our body heat), but it would be shifted to one side, since the camera perspectives were not identical.
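The resolution mismatch boils down to rescaling one frame onto the other's grid. A rough sketch of what that translation looks like (nearest-neighbour sampling as an assumed approach; this is not the code we actually wrote):

```python
def map_ir_to_rgb(ir, rgb_w, rgb_h):
    # ir: 2D list of IR intensities at the IR camera's resolution.
    # Resample it to the RGB frame's dimensions via nearest-neighbour
    # lookup so the two layers can be composited pixel-for-pixel.
    ir_h, ir_w = len(ir), len(ir[0])
    return [[ir[y * ir_h // rgb_h][x * ir_w // rgb_w]
             for x in range(rgb_w)]
            for y in range(rgb_h)]
```

Note this only fixes resolution, not perspective: the two lenses sit a few centimetres apart, which is exactly why our red glow was offset.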
You can check out the project here: InfraViewer. But be warned, the code base is pretty messy!
So last weekend, I spent 36 hours at UofT for their annual hackathon. If you have not been to a hackathon, I highly recommend it! There are tons of like-minded individuals making awesome projects.
Check out the swag I collected:
I even got a new domain name for free: http://squabbit.tech/. Yes, SquabbitTech is our “company” name :3
The application my team built was a super-fuzzy note searching tool called Fuzzy Wuzzy. The idea is that, several years after writing down notes, it is very difficult to remember exactly what you wrote. So we built an application that guesses what you want to search for. We designed three matching algorithms (synonym, misspelled-word, and idea-group matchers), but only had time to implement the first two. For instance, searching for "small" will return notes containing the word "tiny". We didn't have time to create the idea-group algorithm, but we had it planned out: if you search for "car", everything that is a car will be matched (e.g. "Honda Civic").
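The two implemented matchers could look something like this (a simplified sketch, not the actual Fuzzy Wuzzy code; the tiny synonym table is made up, and I'm using `difflib` from the standard library for the misspelling check):

```python
from difflib import SequenceMatcher

# Hypothetical tiny synonym table; a real version would be much larger.
SYNONYMS = {"small": {"tiny", "little"}, "big": {"large", "huge"}}

def matches(query, word, threshold=0.8):
    # Synonym matcher: exact word or a known synonym.
    if word == query or word in SYNONYMS.get(query, set()):
        return True
    # Misspelled-word matcher: string similarity above a threshold,
    # so "smal" still matches a search for "small".
    return SequenceMatcher(None, query, word).ratio() >= threshold

def search(query, notes):
    return [note for note in notes
            if any(matches(query, w) for w in note.lower().split())]
```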
We built the project as an API using Flask, a micro web framework for Python. It is extremely simple in comparison to other frameworks like Symfony. It provides templating and routing out of the box, with database support through extensions, and everything is super easy to get started with.
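To give a sense of how little boilerplate Flask needs, here is a minimal endpoint in the shape ours took (the route and parameter names are illustrative, not our actual API):

```python
from flask import Flask, jsonify, request

app = Flask(__name__)

@app.route("/search")
def search():
    # Read the query string, e.g. GET /search?q=small
    query = request.args.get("q", "")
    # ... the matching algorithms would run here ...
    return jsonify(query=query, results=[])

if __name__ == "__main__":
    app.run(debug=True)
```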
For the next 5 weeks, I will be sleeping from 23:00-2:00 and 4:00-7:00, totaling 6 hours of sleep per day. This is known as a segmented sleep schedule. If it goes well, I will likely continue with it. For some of you, 6 hours probably sounds like a lot, but I usually get 8 hours, so cutting out 2 is a significant reduction.
My motivation is that I was recently offered a research opportunity I could not refuse (I will likely post about it later). To make time for it, I had to sacrifice something, and my choice was sleep. About a year ago I was on a similar routine and it worked well, but I haven't had the motivation to start doing it again until now.