I was hired in late October 2017 as a Mobile App Developer for the Rutgers University Department of Continuing Studies (DoCS). I was immediately given the project of delivering an iOS application for a conference being held on March 12, 2018.

The app itself (RUOnlineCon, available on the App Store and Google Play) is a game where users can complete tasks (challenges) which require use of the camera, uploading photos to be moderated by admins. Once their images are approved, users receive points.
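The submit-then-moderate flow can be sketched as a small state machine. This is an illustrative model only; the type and property names (`Submission`, `SubmissionStatus`, `Player.moderate`) are assumptions, not the app's real code:

```swift
// Illustrative sketch of the moderation lifecycle: a photo submission
// starts pending, and points are awarded only on moderator approval.
enum SubmissionStatus {
    case pending    // uploaded, awaiting moderator review
    case approved   // moderator approved; points awarded
    case rejected
}

struct Submission {
    let objectivePoints: Int
    var status: SubmissionStatus = .pending
}

struct Player {
    var score = 0

    // Award points exactly once, when a pending submission is approved.
    mutating func moderate(_ submission: inout Submission, approve: Bool) {
        guard submission.status == .pending else { return }
        submission.status = approve ? .approved : .rejected
        if approve { score += submission.objectivePoints }
    }
}
```

The guard on `.pending` is the important detail: re-moderating an already-decided submission must not award points a second time.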

The objectives list, showing users' tasks with their respective point values and statuses

The image above shows a customized UITableView in iOS with custom cells and expandable sections. The section headers show the objective category name, and the numbers in the header indicate how many tasks are completed. The image shows 0/5, and the circular '!' icon indicates that the task has been submitted but is awaiting approval from moderators.
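A minimal sketch of the model behind that list, assuming hypothetical names (`Objective`, `ObjectiveSection`) rather than the app's actual types:

```swift
// Illustrative model for the expandable objectives list.
struct Objective {
    let title: String
    let points: Int
    var isCompleted: Bool
    var isAwaitingApproval: Bool   // drives the circular "!" icon
}

struct ObjectiveSection {
    let category: String           // shown in the section header
    var objectives: [Objective]
    var isExpanded = true

    // Header progress text, e.g. "0/5".
    var progressLabel: String {
        let done = objectives.filter { $0.isCompleted }.count
        return "\(done)/\(objectives.count)"
    }
}
```

In the table view's data source, `tableView(_:numberOfRowsInSection:)` would then return `objectives.count` when the section is expanded and `0` when it is collapsed, which is what makes the sections expand and contract.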


An objective with an overlay image

The most exciting development challenge for me in this app was compositing an overlay image on top of the photo the user takes for certain objectives, such as the one pictured above. This involved stitching the two images together into a single CGRect and passing the result back to the camera context to be uploaded as an image into a Firebase storage bucket. I spent most of my time in the AVFoundation framework figuring out how to do this.
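The geometric core of that compositing step is computing where the overlay lands inside the photo's CGRect. A sketch of one common approach, scaling the overlay to fit the photo bounds while preserving its aspect ratio, then centering it (the function name is mine, not the app's):

```swift
import Foundation  // provides CGRect/CGSize/CGPoint

// Returns the rect in which to draw `overlay` so that it fits inside
// `bounds` without distortion, centered — i.e. the rect you would pass
// to the overlay's draw(in:) call before drawing both images into one
// graphics context.
func aspectFitRect(for overlay: CGSize, in bounds: CGRect) -> CGRect {
    let scale = min(bounds.width / overlay.width,
                    bounds.height / overlay.height)
    let size = CGSize(width: overlay.width * scale,
                      height: overlay.height * scale)
    let origin = CGPoint(x: bounds.midX - size.width / 2,
                         y: bounds.midY - size.height / 2)
    return CGRect(origin: origin, size: size)
}
```

On iOS, the actual flattening can then be done with `UIGraphicsImageRenderer`: draw the captured photo into the full bounds, draw the overlay into the rect above, and hand the resulting single `UIImage` back for upload.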

The photos that users take become pieces of a real-time photo mosaic, which can be seen briefly in action in this YouTube video:

2018 Rutgers Online Learning Conference Highlights - YouTube

Mobile iOS app shows individual contributions

Users can view other users' submissions and comment on them. The web app displays the leaderboard and the mosaic, as well as the Twitter feed of tweets containing the hashtag "#RUOnlineCon".

The mosaic itself was an algorithm implemented in a single Firebase Cloud Function. Firebase Cloud Functions have ImageMagick available to them, so I used a series of random coordinates generated in this function to determine each image's location on the screen. That section of the "background image" on the website was then composited with the user's photo. The leaderboard calculation was also implemented as a stateless Cloud Function on Firebase. By implementing these jobs as "serverless" cloud functions, they end up being more scalable than when implemented as REST API endpoints, since they are distributable and deployed on Google's infrastructure as fault-tolerant jobs.
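The deployed function itself is Node.js, but the placement idea can be sketched in Swift (for consistency with the rest of this post). All names here are illustrative: each approved photo claims a random, not-yet-used cell of the mosaic grid, and that cell's pixel rect is where the composite happens:

```swift
import Foundation

// Illustrative sketch of random tile placement for the mosaic.
// Cells are pre-shuffled so each new photo gets a random free slot
// and no slot is used twice.
struct MosaicGrid {
    let columns: Int
    let rows: Int
    let tileSize: Int          // tile edge length in pixels
    private var freeCells: [Int]

    init(columns: Int, rows: Int, tileSize: Int) {
        self.columns = columns
        self.rows = rows
        self.tileSize = tileSize
        self.freeCells = Array(0..<(columns * rows)).shuffled()
    }

    // Pop a random free cell; returns the pixel origin at which to
    // composite the next photo, or nil when the mosaic is full.
    mutating func nextPlacement() -> (x: Int, y: Int)? {
        guard let cell = freeCells.popLast() else { return nil }
        return (x: (cell % columns) * tileSize,
                y: (cell / columns) * tileSize)
    }
}
```

In the real Cloud Function, the returned origin would become the offset passed to ImageMagick's composite operation against the background image.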

My role

I was responsible for the entirety of the iOS app, the cloud functions, the web app logic for rendering the leaderboard data, the layout of the mosaic itself, and the client-side JavaScript code to update the appropriate <div>s, as well as assisting our Android developer with some critical Java code involved in the camera photo-capture action.

I ensured the deployment ran smoothly and monitored the logs for the cloud functions and the database during the conference itself.