Convolutional Neural Networks (CNNs): An Illustrated Explanation

Artificial Neural Networks (ANNs) are used every day to tackle a broad spectrum of prediction and classification problems, and to scale up applications that would otherwise require intractable amounts of data. Machine learning has been witnessing a “Neural Revolution” [1] since the mid-2000s, as ANNs have found their way into tools and technologies such as search engines, automatic translation, and video classification. Among this structurally diverse family, Convolutional Neural Networks (CNNs) stand out for their ubiquity of use, extending the domain of applicability of ANNs from fixed-length feature vectors to variable-length inputs.
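Concretely, the reason a convolutional layer is not tied to a fixed input size is that it slides one small set of weights across the whole input. The sketch below is my own illustration, not taken from the post, with a hand-picked edge-detection kernel: the same 3x3 filter is applied to inputs of two different sizes, whereas a plain feed-forward ANN would require a fixed-length feature vector.

```python
# Minimal sketch: the same 3x3 filter slides over images of any size,
# which is what lets a convolutional layer cope with variable-sized inputs.
# (Strictly, CNN "convolution" layers compute cross-correlation, as here.)
import numpy as np

def conv2d_valid(image, kernel):
    """Naive 'valid' 2-D sliding-window filter."""
    kh, kw = kernel.shape
    ih, iw = image.shape
    out = np.zeros((ih - kh + 1, iw - kw + 1))
    for y in range(out.shape[0]):
        for x in range(out.shape[1]):
            out[y, x] = np.sum(image[y:y + kh, x:x + kw] * kernel)
    return out

# Hand-picked vertical-edge detector (illustrative weights, not learned).
kernel = np.array([[1.0, 0.0, -1.0],
                   [1.0, 0.0, -1.0],
                   [1.0, 0.0, -1.0]])

small = np.random.rand(8, 8)     # an 8x8 input
large = np.random.rand(32, 32)   # a 32x32 input -- same filter still applies

print(conv2d_valid(small, kernel).shape)  # (6, 6)
print(conv2d_valid(large, kernel).shape)  # (30, 30)
```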

Continue reading

The Road to Hell is Paved with Good Intensions

A former colleague, a talented and accomplished user experience professional, recently wrote excitedly of her intension to attend an upcoming UX conference. It was a bit of a throwaway line, likely written in haste, but made in a public forum for consumption by contemporaries and customers alike. Her meaning was clear; the cringe from at least some in her audience equally so. Continue reading

How 1 Million App Calls can Tell you a Bit About Malware – Part 1

Recently, I collaborated with a number of researchers from the Software Systems Laboratory of Columbia University on a study of POSIX (Portable Operating System Interface) abstractions. In a nutshell, we measured how, and to what extent, traditional POSIX abstractions are being used in modern operating systems, and whether new abstractions are taking form and dethroning the traditional ones. The results of this study were presented at the 11th European Conference on Computer Systems (EuroSys ’16).
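As a rough illustration of what “measuring how POSIX abstractions are used” can look like in practice, here is a minimal sketch that tallies system-call names from strace-style trace files. The file format, regular expression, and top-10 report are assumptions made for this example; they are not the methodology or data of the actual study.

```python
# Minimal sketch: count how often each system call appears in one or more
# strace-style trace files and print the ten most frequent ones.
import re
import sys
from collections import Counter

# Matches the call name at the start of a line, e.g. "openat(AT_FDCWD, ...) = 3"
SYSCALL_RE = re.compile(r"^(\w+)\(")

def count_calls(paths):
    counts = Counter()
    for path in paths:
        with open(path) as trace:
            for line in trace:
                match = SYSCALL_RE.match(line.strip())
                if match:
                    counts[match.group(1)] += 1
    return counts

if __name__ == "__main__":
    for name, n in count_calls(sys.argv[1:]).most_common(10):
        print(f"{name:20s} {n}")
```

Usage would be something like `strace -f -o app.trace ./app` followed by running the script on `app.trace`.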

Continue reading

Diving into IBM’s Quantum Experience through your Browser

People tend to shy away when they hear the words “Quantum Computing”. The phrase itself gives the impression that the field is meant for scientists and physics researchers, not for the average person scrolling through their newsfeed. However, quantum computing is steadily maturing and shedding its reputation as an impenetrable field. Understanding it requires as much imagination as it does math or physics knowledge. In this post, I’m going to briefly spark your imagination about the next generation of computers and give you a glimpse of how IBM makes the experience accessible through your web browser, rather than an access-restricted physics lab.

What is Quantum Computing?

Perhaps you are reading this blog post on your desktop, laptop, tablet, or smartphone. All of these devices are traditional computers, or what we call classical computers. Every gadget you use nowadays is built on the concepts of classical computing. But what are classical computers, and how do they differ from quantum computers? Continue reading
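To give a first taste of the difference before diving into IBM’s interface, here is a minimal sketch (my own numpy-based illustration, not anything from the IBM Quantum Experience itself): a classical bit is always exactly 0 or 1, while a qubit is a length-2 complex vector whose squared amplitudes give the probabilities of measuring 0 or 1.

```python
# Minimal sketch: classical bit vs. qubit state vector.
import numpy as np

classical_bit = 1                              # always exactly 0 or 1

ket0 = np.array([1, 0], dtype=complex)         # qubit prepared in state |0>
H = np.array([[1, 1],
              [1, -1]]) / np.sqrt(2)           # Hadamard gate

qubit = H @ ket0                               # equal superposition of |0> and |1>
probs = np.abs(qubit) ** 2                     # probabilities of measuring 0 or 1

print(qubit)   # [0.707...+0.j 0.707...+0.j]
print(probs)   # [0.5 0.5]
```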

Presentation smells: How not to prepare your conference presentation

Recently, I was in Austin, Texas to attend the ICSE (International Conference on Software Engineering) and MSR (Mining Software Repositories) conferences. The authors presented excellent papers on a variety of software engineering topics. Despite the strong technical content, I was disappointed by the presentation skills some of the authors exhibited. Not only students but even some experienced researchers gave rather unexciting presentations. Continue reading