Dr. Mark Humphrys

School of Computing. Dublin City University.


Remote and Network Computing

Users have been accessing sites remotely since the earliest days of the network.

telnet and ftp (and their secure successors)

For decades, these two commands and their secure successors have been the two fundamental commands of the Internet and of remote computing.

  1. telnet (host) - Login to remote host
  2. ftp (host) - Transfer files to/from remote host
With telnet you get a command-line; with ftp you get a read-write file system.


  1. telnet, 1971.
  2. ftp, 1971.

These two commands have been replaced by secure versions:

  1. telnet -> ssh
  2. ftp -> sftp / scp (both use ssh) or ftps (uses ssl)
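The secure replacements look much the same in use. A hedged sketch (user and host names are placeholders, not real DCU hosts):

```shell
# Secure replacements in everyday use (user/host names hypothetical):
ssh user@host.example.com                          # remote command-line (replaces telnet)
sftp user@host.example.com                         # interactive file transfer (replaces ftp)
scp notes.txt user@host.example.com:public_html/   # one-shot file copy over ssh
```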

DCU remote access for students

ssh and sftp to:

(on a CA subnet)

How to login to Linux at DCU

Accessing Unix remotely and from Windows

Windows GUI, Unix ssh, Unix file system

I can use the following two to run a Windows GUI with a Unix ssh command-line underneath. My files on the remote Unix server appear as just another read-write Windows drive. I can use Windows apps to edit them. And I have a Unix ssh command-line always open on which I can run scripts to process them:
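The two tools are not named here, but one common way to get this arrangement (an assumption, not necessarily the setup described above) is to mount the remote files with sshfs and keep a separate ssh session open:

```shell
# Hedged sketch (tool choice is an assumption; user/host names hypothetical).
# 1. Mount the remote home directory as a local folder; on Windows,
#    sshfs-win can expose it as a drive letter instead.
sshfs user@host.example.com:/users/home/user ~/remote-home

# 2. Keep an ssh command-line open to the same host for running scripts.
ssh user@host.example.com

# 3. When finished, unmount.
fusermount -u ~/remote-home
```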

Unix GUI, visible remotely

Run Unix GUI program on remote Unix machine:

Run Unix GUI program on remote Windows machine:
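The usual mechanism behind both is X11 forwarding over ssh. A hedged sketch (host name hypothetical; on Windows a local X server such as VcXsrv or Xming must be running first):

```shell
# X11 forwarding (host name hypothetical):
ssh -X user@host.example.com    # -X: forward X11 connections over the ssh session
xclock &                        # runs on the remote machine, window appears locally
```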


FTP scripting

FTP scripting (with sftp / scp / ftps) is an essential tool for website maintenance.

Say you are working on an offline copy of a website of 10,000 files. You make changes to 137 of the 10,000. You want to upload just those 137 changed files, not the entire 10,000. Dragging and dropping the 137 files to their correct destinations would be quite tedious.

For repetitive tasks like this, drag-and-drop is a worse interface than an automated script (this will be a theme of this course). You can write ftp scripts ("macros") and call them from shell scripts:

Sample ftp scripting

This is how I work when making a new build of my (huge) genealogy site:

  1. I want to upload only the small number of files that have changed since the last build.
  2. I run a program that builds an ftp script of all the changes, like the ftp script below.
  3. I then run sftp with this ftp script.

cd public_html
lcd /users/local/humphrys/local/history-copy
mkdir ./Flanagan/NMI.bird
put ./Flanagan/NMI.bird/IMAG1024.jpg ./Flanagan/NMI.bird/IMAG1024.jpg
put ./Flanagan/NMI.bird/IMAG1025.jpg ./Flanagan/NMI.bird/IMAG1025.jpg
put ./Flanagan/the.bird.html ./Flanagan/the.bird.html
put blog.html blog.html

The script:
  1. Changes directory on the remote site.
  2. Changes directory on the local copy.
  3. Makes the needed remote directories.
  4. Puts various files into various destinations.

The above ftp script is automatically built by a program.
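A hedged sketch of how such a generator might work (all names hypothetical, not the actual program used here): find every file changed since a timestamp file was last touched, and emit one "put" line for each. This demo sets up its own example files so it is self-contained:

```shell
#!/bin/sh
# Sketch of an sftp batch-file generator (names hypothetical).
set -e

LOCAL=$(mktemp -d)            # demo stand-in for the local site copy
BATCH=$(mktemp)               # the generated sftp batch file
STAMP="$LOCAL/.last-upload"   # touched after each successful upload

# Demo setup: an old stamp, then two files edited after it.
touch -t 202001010000 "$STAMP"
mkdir -p "$LOCAL/Flanagan"
echo new > "$LOCAL/Flanagan/the.bird.html"
echo new > "$LOCAL/blog.html"

# Emit the batch file: cd/lcd first, then one "put" per changed file.
(
  cd "$LOCAL"
  echo "cd public_html"
  echo "lcd $LOCAL"
  find . -type f -newer .last-upload | sort | while read -r f; do
    echo "put $f $f"
  done
) > "$BATCH"

cat "$BATCH"
# Then upload with:  sftp -b "$BATCH" user@host
```

The timestamp-file trick (`find -newer`) is one simple way to detect "changed since last build"; a real build program could equally compare checksums.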

HTTP scripting

You can also do HTTP GET or POST scripting from the command-line.
Some tools that do this:

  1. wget

    Advanced wget:

    • To crawl an entire site or check for broken links, something like:
        wget --spider --force-html -i file.html
    • Sites that block programs:
      If a site won't let scripts/programs see its content (only browsers), you can set User agent to pretend to be a browser:
        UserAgent="Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1; SV1)"
        wget -q -O - -U "$UserAgent" URL

  2. lynx
    • does HTTP GET:
        lynx -reload -source URL
    • does HTTP POST:
        cat DATA | lynx -reload -source -post_data URL

  3. curl
    • dumps to command line by default
    • -s for silent mode
    • does HTTP GET
    • does HTTP POST
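A minimal curl example. A file:// URL is used here so the sketch is self-contained and needs no network; in practice the URL would be http(s)://:

```shell
#!/bin/sh
# Fetch a URL from the command line with curl (self-contained demo).
set -e
page=$(mktemp)
echo "hello from curl" > "$page"

body=$(curl -s "file://$page")   # -s: silent mode; body is dumped to stdout
echo "$body"

# An HTTP POST would look like (host name hypothetical):
#   curl -s -d "name=value" https://host.example.com/script
```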

Working remotely

Idea: Your files are "on the network" ("in the cloud") somewhere. You can access them and make changes to them from anywhere. All copies stay in synch.

This is what you actually have within DCU (you can move from terminal to terminal, accessing your files on a central server).

Various ways of working:
