In July 2018, Slack hosted its first-ever official intern
Slackathon.
My team and I decided to tackle the hassle of bill splitting and payments by creating a Slack app that
scans receipts on upload and calculates the final breakdown of a bill once items are selected.
We used the Slack
Web API to send and receive actions from interactive messages,
Microsoft Azure's
Computer Vision API to read line items and their prices,
and Botkit to handle callbacks.
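The heart of the bot is the final-breakdown step: once everyone has claimed their items from the scanned receipt, tax and tip are prorated by each person's share of the subtotal. Below is a minimal sketch of that calculation; the function and variable names are illustrative, not the actual Slackathon code.

```python
# Minimal sketch of the bill-breakdown step: each person claims line items
# from the scanned receipt, then tax and tip are prorated by subtotal share.
# Names and structure are illustrative, not the actual Slackathon code.

def split_bill(claims, tax, tip):
    """claims maps each person to the list of item prices they selected."""
    subtotals = {person: sum(items) for person, items in claims.items()}
    total = sum(subtotals.values())
    breakdown = {}
    for person, subtotal in subtotals.items():
        share = subtotal / total if total else 0.0
        breakdown[person] = round(subtotal + share * (tax + tip), 2)
    return breakdown

claims = {"ana": [12.50, 3.00], "ben": [9.50]}
print(split_bill(claims, tax=2.00, tip=5.00))
```

Prorating by subtotal share (rather than splitting tax and tip evenly) keeps the totals fair when one person orders much more than another.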
Here are some demos of the bot functionality:
On app installation:
DM command (split bill) creates a new chat:
After receipt upload:
I also documented the highlights of my experience on
Medium.
In collaboration with Kyle Lim, James Kao, and Srividhya Shanker.
Imagical Website
Innovative Design Web Team
As a member of the Spring 2018 Web Development Team in Innovative Design,
I was responsible for the design and development of Imagical's new contact page for both prospective members
and sponsors. The following are drafts created in Figma.
After getting the design approved, I built the site using Gatsby, a React framework and static-site generator. I used Bootstrap to create separate tabs for
each contact function, and used the Airtable API to automatically
populate a table within an Airtable base whenever a form is submitted. Additionally, I embedded the application
form into the website, an upgrade over the former Word-document application process.
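The Airtable side is a single REST call: records are created by POSTing a `{"fields": {...}}` JSON body to the base's table endpoint with a bearer token. The site itself is in React, but the API shape is language-agnostic, so here is a Python sketch; the base ID, table name, field names, and key are all hypothetical placeholders.

```python
# Illustrative sketch of the form-to-Airtable step. The Airtable REST API
# creates a record via POST to /v0/{baseId}/{tableName} with a JSON body of
# {"fields": {...}}. Base ID, table, and field names here are hypothetical.
import json
import urllib.request

AIRTABLE_URL = "https://api.airtable.com/v0/appXXXXXXXXXXXXXX/Contacts"

def build_contact_request(api_key, name, email, message):
    payload = {"fields": {"Name": name, "Email": email, "Message": message}}
    return urllib.request.Request(
        AIRTABLE_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = build_contact_request("keyXXXX", "Ada", "ada@example.com", "Hello!")
print(req.get_full_url())
# With real credentials, sending is one line:
# urllib.request.urlopen(req)
```

Each tab's form maps to its own table in the base, so submissions from members and sponsors land in separate views.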
The website is currently hosted on Netlify, and the GitHub repository can be found
here.
In collaboration with Christopher Franco, Trevor Aquino, Mariel Aquino, and Ethan Lee.
Automeet
UI/UX Product Design
Automeet is an automatic event scheduler developed by Eliot Hsu, implemented as a web app and an
iOS/Android mobile app that generates events by detecting empty timeslots across participants' Google Calendars.
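The scheduling idea can be sketched as a classic interval problem: merge everyone's busy intervals, and the gaps left inside the search window are the candidate timeslots. Below is a minimal illustration, with plain numbers (e.g., hours) standing in for real Google Calendar times; this is not Automeet's actual code.

```python
# Hedged sketch of the core scheduling idea: merge all participants' busy
# intervals, then the gaps inside the search window are candidate timeslots.
# Plain numbers stand in for real calendar timestamps.

def free_slots(busy, window):
    """busy: list of (start, end) intervals from all calendars combined."""
    start, end = window
    merged = []
    for s, e in sorted(busy):
        if merged and s <= merged[-1][1]:
            merged[-1][1] = max(merged[-1][1], e)  # overlapping -> extend
        else:
            merged.append([s, e])
    slots, cursor = [], start
    for s, e in merged:
        if s > cursor:
            slots.append((cursor, min(s, end)))  # gap before this busy block
        cursor = max(cursor, e)
    if cursor < end:
        slots.append((cursor, end))              # trailing gap
    return slots

print(free_slots([(9, 10), (9.5, 11), (13, 14)], window=(9, 17)))
```

Sorting then merging runs in O(n log n), so the approach scales comfortably to many participants' calendars.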
I collaborated with Eliot in developing the Web UI and designing the Mobile UI, as well as creating the
Automeet branding:
I researched, designed, and iterated user flows for setup/onboarding, creating a
Pinterest board for inspiration:
I also created UI mockups for viewing and creating events, inviting friends, and configuring settings.
Open Computing Facility
Logo Redesign
The Open Computing Facility has been a student resource at UC Berkeley since 1989, providing
thousands of students with free printing and computing. As Marketing Director of a senate office
in UC Berkeley’s student government (ASUC), I had the opportunity to redesign the decades-old
logo in both abbreviated and wordmark forms.
I also created an informational flyer using the new logo to help raise campus awareness of the resource.
Top: Progress/Iterations
Middle: Abbreviated
Bottom: Wordmark
PBL Design Committee
Graphic Design
In Fall 2016, I served as a Design Committee Member and, in the following semester,
as a Design Committee Director for Berkeley PBL, an on-campus professional development
organization. Here is some of my work as a Committee Member, where I was given
weekly design assignments on topics ranging from color theory to typography.
As Director, I was also part of a Dev Team of PBL alumni, where I helped design and code
a website to showcase the Design Committee's projects. With two other members of the team, Frances Thai and
Emily Zhu, I created low-, medium-, and
high-fidelity prototypes with Adobe Illustrator, Figma, and HTML/CSS. The project is also on
GitHub.
Top to bottom: Low to High Fidelity
Pablo
Pablo is a Facebook bot created by the PBL Dev Team to send club-wide Facebook messages and personal
reminders as well as to pair people anonymously as chat buddies.
The following are variations of his profile picture and cover photo.
100 Days of Cats
Personal Project
100 Days of Cats is pretty much what it sounds like: one cat graphic a day,
for 100 consecutive days. I was originally inspired by 100 Days of Fonts,
so I also paired a font with each simple graphic; each piece took less than 30
minutes to execute. (Spring–Summer 2017)
In Summer 2018, I interned at Squishy Robotics, a mobile-sensing startup that deploys
tensegrity robots as sensor platforms in disaster-relief and rescue situations. In many emergencies, first responders
need a significant amount of overhead time to arrive and prepare for the scene (e.g., donning HazMat suits for chemical hazards).
Additionally, human involvement can be dangerous in unmonitored, unknown situations. By deploying a sensor-equipped robot from a drone
to gather essential data about the surroundings (temperature, air quality, hazardous chemicals, etc.), first responders can be better
equipped to handle emergencies and reduce potential injuries and casualties. This process is highlighted in the storyboard below:
My role as a UI Developer Intern was to create an intuitive user interface that quickly informs first responders
about the status of an emergency and warns them of possible hazards. Additionally, I was able to use my design
experience to help brand the company, from the logo and website to business cards and stationery.
Technology
In this full-stack project, I needed to connect the back-end sensor data, parsed in Python,
to the front-end web application written in React.js. The back-end stack ended up including a database
to which the sensor data was written, a few interpolation functions that cleaned and smoothed the data,
and a Flask WebSocket server that emitted the data on query. The data was emitted through Socket.IO and
received by React components that loaded dynamically for display optimization. One design choice I made
around the React client's data query allowed for a playback mode, which could replay time ranges
from history instead of streaming real-time sensor data.
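The cleaning step can be illustrated with a simple gap-filling interpolation: dropped readings are reconstructed linearly from their nearest valid neighbors. This is a sketch of the general technique, not the actual Squishy Robotics code, and it assumes each series contains at least one valid reading.

```python
# Illustrative version of the cleaning step: sensor packets sometimes arrive
# with dropped readings, so missing values (None) are filled by linear
# interpolation between the nearest valid neighbors. Function names are
# mine, not the actual company code.

def interpolate_gaps(samples):
    """samples: list of floats with None for dropped readings."""
    out = list(samples)
    valid = [i for i, v in enumerate(out) if v is not None]
    for i, v in enumerate(out):
        if v is not None:
            continue
        prev = max((j for j in valid if j < i), default=None)
        nxt = min((j for j in valid if j > i), default=None)
        if prev is None or nxt is None:      # edge gap: copy nearest reading
            out[i] = out[nxt if prev is None else prev]
        else:                                # interior gap: linear blend
            t = (i - prev) / (nxt - prev)
            out[i] = out[prev] + t * (out[nxt] - out[prev])
    return out

print(interpolate_gaps([20.0, None, None, 26.0, None]))
```

Cleaning on the server before emitting keeps the React components simple: they can render whatever arrives over Socket.IO without special-casing dropped packets.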
Dashboard UI
The following are screenshots of the three main dashboard pages. The Visual tab displays a 360° camera view using Three.js and
a map showing the current GPS location and a polyline path using the Google Maps JavaScript API. The Sensor tab shows real-time (or historical)
data for each sensor type equipped on the robot, visualized with Uber's react-vis. Finally, the Analytics tab displays a heatmap
per sensor type, also using the Google Maps API, along with a react-vis bar chart as a placeholder for predicted hazard probabilities.