Very cool video on HCI
Here is a nice article on Mashable that addresses many of the issues encountered when trying to develop content for multiple devices. While it's a basic (mostly high-level) article, it points out some major considerations in computer vs. tablet vs. mobile design and is well worth the read:
Here is a chart I made comparing the speeds of various internet connections, showing how fast (or slow) you could expect files to download. Remember that these figures are hypothetical and depend heavily on where you are, what kind of device you have, and so on. Theoretically, 4G and high-speed internet should be nearly the same, but realistically this is not the case. So please do not take these as fact; rather, use this chart as a guide when developing for both web and mobile devices.
| Connection | | | | | |
|---|---|---|---|---|---|
| 56k modem | 15 seconds | 36 seconds | 1 minute | 2.5 minutes | 1 day 18 hours |
| 3G | <1 second | 6 seconds | 12 seconds | 25 seconds | 7 hours |
| High speed/Wireless/4G | <1 second | 1 second | 3 seconds | 5 seconds | 1.5 hours |
*Note – these speeds assume a best-case scenario. More than likely all speeds will be slower. Both 3G and 4G speeds will probably be significantly slower, especially as 4G max speeds are not even available yet.
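If you want to make your own estimates, the arithmetic behind a chart like this is just file size divided by link throughput. Here is a minimal sketch; the link speeds are illustrative assumptions (nominal rates, not the real-world numbers the note above warns about):

```python
# Rough download-time estimator: time = file size / link throughput.
# The speeds below are assumed nominal rates, not measurements.
LINK_SPEEDS_BPS = {
    "56k modem": 56_000,         # dial-up, ~56 kilobits per second
    "3G": 2_000_000,             # nominal; real-world is often far lower
    "4G/high-speed": 20_000_000, # nominal; see the caveat above
}

def download_seconds(file_size_mb: float, link: str) -> float:
    """Best-case seconds to download a file of the given size (in megabytes)."""
    bits = file_size_mb * 8 * 1_000_000       # megabytes -> bits
    return bits / LINK_SPEEDS_BPS[link]       # bits / (bits per second)

# A 5 MB file over a 56k modem: 40,000,000 bits / 56,000 bps ≈ 714 seconds
print(round(download_seconds(5, "56k modem")))   # ~12 minutes
print(download_seconds(1, "3G"))                 # 4.0 seconds
```

Swap in your own measured speeds and typical asset sizes to sanity-check a page design before shipping it to mobile users.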
What is HCI?
“Human-Computer Interaction (HCI) is the study and the practice of usability. It is about understanding and creating software and other technology that people will want to use, and will find effective when used. The concept of usability, and the methods and tools to encourage it, achieve it, and measure it are now touchstones in the culture of computing” – Carroll (2002)
HCI is how we interact with computer hardware and software. Not just computers, though, but all machines – your car, dishwasher, airplane, etc. The point of HCI is to make these technologies more ‘user friendly’ so that they are easier for us to interact with both physically and mentally. The key to HCI is usability – so expect more blog posts on usability.
Well, this is very interesting. This computer system can determine when a person's brain activity indicates overload and, in response, adjust the computer interface to take that load off the user. While I am not sure this works (I need to see the research peer reviewed and read it), I am very intrigued by the possibilities and promise this holds for both cognitive-load and human-computer interaction research.
“Their system, called Brainput, is designed to recognize when a person’s workload is excessive and then automatically modify a computer interface to make it easier. The researchers used a lightweight, portable brain monitoring technology, called functional near-infrared spectroscopy (fNIRS), that determines when a person is multitasking. Analysis of the brain scan data was then fed into a system that adjusted the user’s workload at those times. A computing system with Brainput could, in other words, learn to give you a break.”
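The core loop described above – measure workload, then shed load when it gets too high – can be sketched in a few lines. To be clear, this is a toy illustration, not the researchers' system: the workload score stands in for real fNIRS data, and the function name, threshold, and task list are all my own assumptions.

```python
# Toy sketch of an adaptive-interface loop in the spirit of Brainput.
# A workload score (here supplied directly; the real system derives it
# from fNIRS brain monitoring) decides how much the UI asks of the user.
WORKLOAD_THRESHOLD = 0.7  # assumed cutoff for "overloaded"

def adjust_interface(workload: float, active_tasks: list) -> list:
    """If measured workload is too high, shed all but the primary task."""
    if workload > WORKLOAD_THRESHOLD and len(active_tasks) > 1:
        return active_tasks[:1]   # keep only the primary task
    return active_tasks

tasks = ["navigate robot A", "navigate robot B", "monitor chat"]
print(adjust_interface(0.9, tasks))  # overloaded: secondary tasks are dropped
print(adjust_interface(0.4, tasks))  # fine: all three tasks stay
```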
More of the article here: http://mashable.com/2012/05/14/brainput/