Processing large datasets is a common stage in almost any kind of research. The problem appears when processing that data would take a very long time on a personal laptop. To solve this, we can turn to any major cloud provider and accelerate the processing stage while wasting minimal time and resources. With a few clicks, cloud computing lets us build a compute cluster, process our data, retrieve the results, and destroy the cluster, all for only a few dollars.
Agile methods in software development are very popular. The roots of these methods must be sought in the Agile Manifesto, which says:
In short, these lines say that the world is always changing, and that adapting to change, collaborating with people, and transferring knowledge should be priorities in our professional lives.
These objectives fit software development, but they also fit research, because we software developers and researchers are always looking for new challenges, always growing our professional networks, and always trying to advance science.
In recent months I conducted a few usability studies, and on reflecting on them I decided to share my experience, as it might be helpful to anyone getting started with usability. This article attempts to summarize my experience and thoughts on usability experiments.
When starting a usability study or experiment, the practitioner or researcher must answer some initial questions about the work ahead.
Regarding your research, the most important question to answer is generally “What is my motivation, or why am I doing this?”. In a few words, as a researcher you must not only formulate your research question but also a way to answer it.
Research methods are there to help you formulate and answer new questions about usability, user experience, and human-computer interaction.
Artificial Neural Networks (ANNs) are used every day to tackle a broad spectrum of prediction and classification problems, and to scale up applications that would otherwise require intractable amounts of data. ML has been witnessing a “Neural Revolution”1 since the mid-2000s, as ANNs found application in tools and technologies such as search engines, automatic translation, and video classification. Among the structurally diverse family of ANN architectures, Convolutional Neural Networks (CNNs) stand out for their ubiquity of use, expanding the domain of applicability of ANNs from fixed-length feature vectors to variable-length inputs.
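The last point — handling variable-length inputs — can be sketched in a few lines of plain Python. This is a toy 1-D illustration, not any particular library's implementation, and the filter values are hypothetical stand-ins for learned weights: the same small filters slide over an input of any length, and a global pooling step collapses the result to a fixed-size vector.

```python
# Minimal sketch: convolution + global pooling maps variable-length
# sequences to fixed-size feature vectors. Filter values are hypothetical.
import random

def conv1d(x, kernel):
    """Valid 1-D convolution of a scalar sequence with one filter, plus ReLU."""
    k = len(kernel)
    return [max(0.0, sum(kernel[j] * x[i + j] for j in range(k)))
            for i in range(len(x) - k + 1)]

def global_avg_pool(feature_map):
    """Average over positions: output size no longer depends on input length."""
    return sum(feature_map) / len(feature_map)

# Three hypothetical 3-tap filters standing in for learned weights.
filters = [[0.2, 0.5, -0.3], [1.0, -1.0, 0.5], [-0.4, 0.1, 0.8]]

def embed(x, filters):
    """Map a variable-length sequence to a fixed-size feature vector."""
    return [global_avg_pool(conv1d(x, f)) for f in filters]

for length in (10, 25, 100):
    x = [random.random() for _ in range(length)]
    print(length, len(embed(x, filters)))  # always 3 features, any length
```

A real CNN stacks many such convolutional layers and learns the filter weights from data, but the structural trick is the same: weight sharing across positions, followed by pooling, decouples the network's parameter count from the input's length.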
To introduce you to Exascale computing, as well as its challenges, we interviewed the distinguished Professor Jack Dongarra (University of Tennessee), an internationally renowned expert in high-performance computing and the leading scientist behind the TOP500 (http://www.top500.org/), a list which ranks supercomputers according to their performance.