When I started using computers regularly (around 1983), there was only the command line. You typed in commands, and you got text in response. By 1998, through some combination of Xerox, Apple, RISC OS, and Windows, I (and most of the rest of the world) was doing everything with a graphical user interface. Psychologists were writing at the time about how GUIs were a much better way to interact with computers.
For some tasks, they were right, and GUIs certainly reduce 'entry cost' for most tasks, leading to nearly everyone now being a computer user. But, for research, it is often worth trading a slightly higher entry cost for a longer-term increase in power, speed, and a reduction in errors. By about 1998, I ended up doing analyses in Excel that I would previously have done in BBC BASIC and, while that felt like progress, it turned out to be a big mistake.
Excel is not a good way to pre-process or analyze data. The chances for human error are great, and most operations require substantial clerical effort. Data and analysis are not kept separate, making it hard to document, reproduce, and reuse analyses. The same goes for SPSS, at least if you use the GUI rather than syntax.
In 2012, R got me back into command-line data analysis, and it's fantastic. But the gain goes beyond R itself. The additional insight was to use an operating system that has a proper command-line interface, which means Apple OS X or Linux. Then, pretty much anything you do on a computer (other than the directly artistic, e.g. illustration) can be done better, faster, and with fewer errors using the command-line interface ('terminal') rather than the GUI. As ever, getting started on task X with the command line takes longer than with a GUI, but for any task you do more than about once a week, the time invested quickly pays off.
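To make the point concrete, here is a minimal sketch of the kind of repetitive job that is tedious and error-prone in a spreadsheet but trivial in a terminal: combining monthly CSV files into one dataset, keeping the header row only once. The file names and data are made up for illustration.

```shell
# Create two small sample files so the example is self-contained
# (in real use these would be your existing monthly exports)
printf 'id,score\n1,10\n2,12\n' > jan.csv
printf 'id,score\n3,9\n4,15\n'  > feb.csv

# Write the header once, then append each month's data rows
head -n 1 jan.csv > combined.csv
for f in jan.csv feb.csv; do
  tail -n +2 "$f" >> combined.csv   # skip each file's header line
done

wc -l combined.csv   # 1 header + 4 data rows = 5 lines
```

Because the steps are written down rather than clicked through, the same script works unchanged next month, and the whole procedure is documented and reproducible by default.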