1. Web & Backend
2. macOS & iOS
3. Physical & Electronics
4. Graphics
5. Audio/Visual
6. Compilers/Language tools
7. Talks etc.

A portfolio is usually an artist’s or designer’s way of showcasing their work; it’s less common for developers or engineers to have one. Still, since most of the things I’ve worked on are not easily accessible (you either need an entry ticket to a faraway museum, or a time machine), I thought I’d document some cool things I’ve done during my career as a software developer.

Don’t hesitate to contact me at orestis@orestis.gr if you want to discuss any of this! Everything listed below is work I have completed over many years, both in professional and personal contexts.

Web & Backend

I’ve developed a host of different CMSes for various projects using the Django web framework. In recent years, I’ve embraced JavaScript front-end development by relegating Django to a REST role and using TypeScript for crucial pieces of the UI. On many occasions I avoided bloat by interfacing with the various DOM classes directly (e.g. for file uploading, drag-and-drop, client-side image cropping, etc.).

I’ve also developed various UI-first web applications with React, Ember and good old VanillaJS 😉. I’ve written custom CSS to match existing designs, and I’ve also used custom Bootstrap themes.

For more advanced back-end functionality, like controlling dozens of machines, synchronizing content, and running repeated tasks with fine-grained control, I’ve used the Python Twisted framework extensively. More recently I’ve moved to Elixir, for the benefit of a more robust VM and true multi-core support.

In 2017, for the Exhibition Management System of the Canada Science and Technology Museum, I went a step further and exposed key Prometheus metrics and a Grafana exhibition health dashboard, as well as a simple in-house analytics system based on the ELK stack.
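The Prometheus text exposition format is simple enough to sketch by hand; here is a stdlib-only renderer (the metric name and help text are made up for illustration — a real setup would likely use the `prometheus_client` library):

```python
def render_metrics(metrics):
    # Render {name: (type, help, value)} in the Prometheus text
    # exposition format, suitable for serving at a /metrics endpoint.
    lines = []
    for name, (mtype, help_text, value) in sorted(metrics.items()):
        lines.append(f"# HELP {name} {help_text}")
        lines.append(f"# TYPE {name} {mtype}")
        lines.append(f"{name} {value}")
    return "\n".join(lines) + "\n"
```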

macOS & iOS

Apart from the low-level frameworks mentioned elsewhere, I’ve written a lot of Mac interactives using plain Cocoa in Objective-C, with custom-styled elements and a UI adapted to touchscreens.

Around 2012, the Chameleon project was announced: an ambitious effort to port UIKit to the Mac. I maintain a private fork that adds proper multi-touch & gesture support, and I’ve written a lot of UI- and design-driven interactives with it.

Using UIKit for interactives development unlocked a new level of functionality and UI customization over what Cocoa can offer. This led to some quite interesting and beautiful apps, with full multi-touch and animation support. And since UIKit is a thin layer on top of low-level frameworks I was already comfortable with, I could be productive from day one.

I’ve written a wide range of such interactives with UIKit/Chameleon.

Apart from the UIKit/Cocoa split, most of the Apple frameworks I’ve used for clients are truly cross-platform; some examples include CoreImage, SpriteKit, SceneKit and GameKit. Some other frameworks, like CoreText, were originally macOS-only and only recently made their way to iOS. And of course, CoreAnimation and CoreGraphics are almost equivalent on both.

So far I’ve used only Objective-C (and PyObjC, for early projects) for all my macOS & iOS work; Swift was just too immature for the kind of budgets and deadlines I had to face. However, in the spring of 2016 I had the opportunity to tutor a university student on making multiplayer games for iOS. Using Swift was a natural choice there, as it was much more approachable. I helped him develop a simple iOS game with SpriteKit, GameKit and the Multipeer Connectivity framework.

I really enjoy quick iteration and collaboration with UI/UX designers; sometimes I also have good ideas that get adopted in the final product. Taking inspiration from the iPhone’s (then recent) "slide-to-unlock" paradigm together with the diagram of a car’s gear stick, I proposed the "drag-the-dot" interface that we used to slow down interactions and promote discussion between student groups. I also had to write the code that keeps the dot on an arbitrary Bézier curve, which was also fun! In another case, I proposed and implemented a "quick-zoom" feature: when visitors press on a dense global map, Fitts’s law tells us there’s a high chance of a miss; the solution was to zoom in progressively until the various targets are large enough to be unambiguous.
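The dot-on-a-curve trick can be sketched in a few lines: evaluate the cubic Bézier densely and snap the touch to the nearest sample (this is an illustrative Python sketch of the idea, not the original Objective-C code):

```python
def bezier_point(p0, p1, p2, p3, t):
    # Evaluate a cubic Bezier at parameter t, per coordinate.
    u = 1 - t
    return tuple(
        u**3 * a + 3 * u**2 * t * b + 3 * u * t**2 * c + t**3 * d
        for a, b, c, d in zip(p0, p1, p2, p3)
    )

def snap_to_curve(p0, p1, p2, p3, touch, samples=200):
    # Project the touch location onto the curve by brute-force sampling;
    # plenty accurate for UI work, and easy to refine later if needed.
    return min(
        (bezier_point(p0, p1, p2, p3, i / samples) for i in range(samples + 1)),
        key=lambda pt: (pt[0] - touch[0]) ** 2 + (pt[1] - touch[1]) ** 2,
    )
```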

Physical & Electronics

Having parted with my tiny vinyl collection, I came up with the Faux Vinyl project, which I use daily in my living room. NFC tags are sandwiched between a printed album cover and card stock, and contain the Spotify ID of that album. Using an off-the-shelf NFC reader and the native CryptoTokenKit framework, plus a little bit of AppleScript, I can again play my favorite music without using any kind of screen interface.
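The AppleScript half of that pipeline is tiny; a hedged sketch of building the `osascript` invocation (the album ID is a placeholder, and I’m assuming the Spotify app’s scripting dictionary accepts album URIs via `play track`):

```python
def spotify_play_command(album_id):
    # Build the shell command that asks the Spotify macOS app to play
    # the album URI read from an NFC tag.
    uri = f"spotify:album:{album_id}"
    script = f'tell application "Spotify" to play track "{uri}"'
    return ["osascript", "-e", script]
```

In practice this would be handed to `subprocess.run` once the reader reports a tag.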

I’ve written interactives that integrate Phidgets and custom Arduino-based sensors and electronics. I’ve also given a 3-hour "Introduction to Programming" workshop to teenagers based on the Arduino platform using some hand-soldered prototyping boards with LEDs, potentiometers, buzzers etc.

Graphics

In 2009, for the Museum of Australian Democracy, I implemented most of the core functionality of the "Timeline of Australian Democracy" interactive. This work was based on the CoreAnimation, CoreGraphics & CoreText frameworks. At that point in time there were very few resources available, so I spent a lot of time reading documentation, experimenting, profiling and debugging in order to produce a fluid and robust app.

For the same project, I implemented the map element of the "Challenges" interactive. This was based on a custom parser that converted shapefiles to CoreGraphics path objects.
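The shapefile format is plain binary with a fixed 100-byte header, so a parser needs little more than `struct` (a minimal sketch; field offsets follow the ESRI spec, with the mixed big/little-endian quirk intact):

```python
import struct

def read_shp_header(data: bytes):
    # Main header: file code 9994 (big-endian) at offset 0, shape type
    # (little-endian) at offset 32, bounding box doubles at offset 36.
    file_code, = struct.unpack_from(">i", data, 0)
    if file_code != 9994:
        raise ValueError("not a shapefile")
    shape_type, = struct.unpack_from("<i", data, 32)
    bbox = struct.unpack_from("<4d", data, 36)  # xmin, ymin, xmax, ymax
    return shape_type, bbox
```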

In 2010, for the TRAIL project, I implemented a limited, yet reasonably complete SVG parser in Python/Obj-C — again, converting the various SVG elements into CoreGraphics paths. This enabled a workflow where student activities designed in Illustrator could be turned into "smart templates", which also unlocked a WYSIWYG CMS interface for the museum staff.
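The core of such a parser fits in a screenful; here is a deliberately limited sketch that handles only absolute M/L commands (the real one covered far more of SVG, and emitted CoreGraphics paths rather than point lists):

```python
import re
import xml.etree.ElementTree as ET

SVG_NS = "{http://www.w3.org/2000/svg}"

def paths_from_svg(svg_text):
    # Pull each <path> element's "d" attribute and turn absolute
    # moveto/lineto commands into (command, x, y) triples.
    root = ET.fromstring(svg_text)
    paths = []
    for el in root.iter(f"{SVG_NS}path"):
        points = [
            (cmd, float(x), float(y))
            for cmd, x, y in re.findall(r"([ML])\s*([-\d.]+)[ ,]([-\d.]+)", el.get("d", ""))
        ]
        paths.append(points)
    return paths
```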

I was pleasantly surprised to find out that the HTML5 Canvas API is extremely similar to CoreGraphics; to the extent that a lot of code can be trivially ported between the two. I’m eager to explore this area in UI development.

Audio/Visual

For the 2011 TELUS Spark and the 2012 Science of Rock’n’Roll projects, I did a lot of work with the CoreAudio and AVFoundation frameworks.

This included multi-channel audio playback, synchronised audio and video recording (with different devices, which is harder than it should be), and applying live sound processing to musical instruments. The most complex app was a karaoke app that had to play back the backing track, record the singer’s voice, and display live VU-meters, all the while playing back a different monitor mix, and then produce a ready-to-share video file seconds after the song finishes.

I’ve also written an interactive that recorded visitors playing with a Reactable installation, keeping only the last 30 minutes. I created a trimming editor component that visitors used to share their personal performance clip online, using a custom-built video processing pipeline to compress and upload the files to Vimeo.
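The “keep only the last 30 minutes” behaviour is essentially a time-bounded ring buffer; a sketch of the idea (timestamps in seconds, frames as opaque blobs — the real recorder worked on media samples):

```python
from collections import deque

class RollingRecorder:
    # Retain only the most recent `window` seconds of (timestamp, frame)
    # pairs, discarding older material as new frames arrive.
    def __init__(self, window):
        self.window = window
        self.frames = deque()

    def push(self, timestamp, frame):
        self.frames.append((timestamp, frame))
        while self.frames and timestamp - self.frames[0][0] > self.window:
            self.frames.popleft()
```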

Compilers/Language tools

Back in 2008, I was working at Resolver Systems (since transformed into PythonAnywhere) on a Pythonic spreadsheet in .NET. I got curious about parsing programming languages, which seemed quite esoteric, and in my spare time I coded up PySmell, an autocomplete helper for Python. It was quite a fun experience, which resulted in my winning a "Best Lightning Talk" award at PyCon UK 2008. I also wrote a short article for Python Magazine #18, March 2009.
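Today the `ast` module makes the first step of such a tool almost trivial; a tiny sketch of collecting the names an autocompleter could offer (a small slice of what PySmell actually indexed):

```python
import ast

def completion_names(source):
    # Walk the module's AST and gather every function, async function
    # and class name as completion candidates.
    tree = ast.parse(source)
    return sorted(
        node.name
        for node in ast.walk(tree)
        if isinstance(node, (ast.FunctionDef, ast.AsyncFunctionDef, ast.ClassDef))
    )
```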

Talks etc.

I’ll be speaking about writing TCP/IP applications in Elixir at ElixirConf EU 2018.

I’ve given a talk on PyObjC at EuroPython 2009, and a four-hour tutorial on Twisted at EuroPython 2011, which had to be repeated due to popular demand! I’ve also kick-started the Python-Greece user group, which now has a life of its own.

Copyright © 2017 Orestis Markou | orestis@orestis.gr | orestis.gr