Bisection, or binary search, is an example of a simple yet powerful idea in computer science that has become an integral part of every computer scientist’s arsenal. It is synonymous with logarithmic time complexity, which is music to a programmer’s ears. Yet the technique never fails to surprise us with the creative ways it has been put to use to solve tricky programming problems. This blog post will acquaint you with a few such problems and their solutions, to delight you and to make you appreciate its ingenuity and efficacy. Continue reading
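As a quick refresher before the post itself, the classic form of the technique can be sketched in a few lines (a minimal illustration, not taken from the post):

```python
def binary_search(items, target):
    """Return the index of target in the sorted list items, or -1.

    Each iteration halves the search interval, so the loop runs
    O(log n) times -- the logarithmic behaviour mentioned above.
    """
    lo, hi = 0, len(items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if items[mid] == target:
            return mid
        elif items[mid] < target:
            lo = mid + 1  # target can only be in the upper half
        else:
            hi = mid - 1  # target can only be in the lower half
    return -1
```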
Tag Archives: Algorithms
Traceroute to the Front Door: Trimming Public Net Hops
Traceroute is a wonderful computer networking diagnostic tool. This article presents a traceroute script written in Python, including a customization that identifies just the routers within your private network (from the host up to and including the public/internet gateway). Continue reading
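To give a flavour of the customization, here is a minimal sketch (not the article’s actual script): assuming we already have the ordered list of hop IPs from a traceroute run, the standard-library `ipaddress` module can trim it to the private-network hops plus the first public hop, treated here as the internet gateway.

```python
import ipaddress

def trim_to_gateway(hops):
    """Keep hops on the private network, plus the first public hop.

    hops is assumed to be the ordered list of router IP strings
    produced by a traceroute run; everything past the first
    public (non-RFC-1918) address is dropped.
    """
    trimmed = []
    for ip in hops:
        trimmed.append(ip)
        if not ipaddress.ip_address(ip).is_private:
            break  # first public address: stop at the gateway
    return trimmed
```

For example, `trim_to_gateway(["192.168.1.1", "10.0.0.1", "8.8.8.8", "1.1.1.1"])` keeps only the two private routers and the first public hop.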
Is Your Research Reproducible?
When a scientific experiment achieves the expected result, researchers rush to draft a manuscript, submit it, and cross their fingers for acceptance. When the paper is accepted for publication, everyone is a happy camper! Not much later, however, the researchers may discover it was a fool’s paradise: their work never gets cited by peers, often simply because others cannot reproduce the scientific experiment, i.e. they cannot compare it with their own experiments. There are several reasons that block research reproducibility. In this post, I will preview some of those that frequently appear in the field of computational science. Continue reading
Providing feedback in the classroom
In my previous post, I discussed some current and ongoing research on effective pedagogical approaches to STEM education. The problems in STEM education have gained much attention recently because of the growing gap between demand and skill in American STEM jobs, likely caused at least in part by a lack of interest or discouragement among American students. Continue reading
Big Data, Communication and Lower Bounds
As the size of the available data increases, massive data sets can no longer be stored in their entirety in the memory of a single machine. Furthermore, given the limited memory and computational power of a single node, one must distribute both the data and the computation among multiple machines when processing such tasks.
However, transferring big data is very expensive; in fact, it is often more expensive than the computation on the data itself. Thus, in the distributed model, the amount of communication plays an important role in the total cost of an algorithm, and the aim is to minimize the communication among processors (CPUs). This is one of the main motivations for studying the theory of Communication Complexity, which is strongly motivated by Big Data processing. Continue reading
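A toy example makes the cost gap concrete (a hypothetical setup, not from the post): suppose several machines each hold a shard of a data set and we want the global sum. Shipping every item to one node costs communication proportional to the data size, while sending one locally computed partial sum per machine costs only one number per machine.

```python
# Assumed example data: each inner list is one machine's shard.
shards = [[1, 2, 3], [4, 5], [6, 7, 8, 9]]

# Naive protocol: ship every item to a central node.
naive_messages = sum(len(s) for s in shards)  # one message per data item

# Communication-efficient protocol: each machine sends one partial sum.
partials = [sum(s) for s in shards]  # computed locally, no data movement
smart_messages = len(partials)       # one message per machine
total = sum(partials)                # combined at the coordinator
```

Here the naive protocol sends 9 numbers while the aggregated protocol sends 3, yet both yield the same total; communication complexity studies exactly how little such protocols can get away with sending.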