ftp mput recursive and/or curl question - Linux Help Desk
[Feb. 15th, 2010|03:26 am]
linuxsupport
[tonytraductor]
[Current Mood | perplexed]
I was trying to move some stuff up onto my server, and I thought I'd try moving it without using a GUI ftp program or my host's online cpanel tools (blecch) for once.
Mostly because I want to be able to script what I'm doing for future use.
Now, I've used GUI ftp clients (and even wrote a little tcl one for quick jobs), but the one I wrote
will only move one file at a time...I recall not being able to figure out how to send a dirfull, recursively, in fact, when writing the little guy.

Now, I know it's possible to move a whole directory at a time, because thousands of existing GUI ftp clients do it.
But I don't seem to be succeeding.
First, I can't find an ftp command (this is with the stock command-line ftp client on Debian Lenny, not the tcl ftp package I used to write my little thingy) to move a whole directory recursively.
I can give it a wildcard, and it will load up all the distinct files in a directory, but it won't send a directory within that directory and the files therein, recursively, as I want.

I also thought I'd try curl.

I did something like this:
############
#!/bin/bash

# sending up the nanoblog

echo "writing nblist"
find nanoblog/* > nblist
echo "list written \nnow sending files..."
for i in $(cat nblist)
do

curl -v -u me:pwd -T --ftp-create-dirs $i ftp://ftp.baldwinsoftware.com/nb/

done
echo "files sent"
exit
################



It appears to function on the command line, but when I checked the remote files (html pages) on the server with a normal browser,
it appears as though something has gone amiss.
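
For what it's worth, the likely culprit is argument order: curl's -T takes the very next argument as the local file to upload, so "-T --ftp-create-dirs" tells curl to upload a file literally named "--ftp-create-dirs". A reworked sketch (untested; the credentials are placeholders, and the remote path handling is my assumption):
#############
#!/bin/bash

# sketch: upload every regular file under nanoblog/, keeping the
# directory layout under /nb/ (me:pwd is a placeholder login)
find nanoblog -type f | while read -r f
do
    # --ftp-create-dirs makes missing remote directories;
    # ${f#nanoblog/} strips the local prefix from the remote path
    curl -v -u me:pwd --ftp-create-dirs \
        -T "$f" "ftp://ftp.baldwinsoftware.com/nb/${f#nanoblog/}"
done
#############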

I was expecting it to overwrite the existing files on the remote server.
But it seems to hose the files, or something.
If I could get that to work, I'd like to make it only send/overwrite files on the remote server when the local file is new (not on the remote),
or newer (exists on the remote, but was edited more recently locally), and I haven't found how to do that.
curl's -z seems to have something to do with downloading/getting files under some time condition, but I don't see how to
read the timestamp from the server and then, if the local file is newer, go, and if not, abort. So what I have, to my knowledge,
will overwrite all files, whether they've changed or not. That's not efficient.
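
One possible way to get at the remote timestamp, for the record: curl's -I/--head on an FTP URL asks the server for the file's size and modification time and reports the latter as a Last-Modified header. Comparing that against the local mtime gives a crude "only if newer" gate. A sketch (untested; assumes GNU date/stat, with placeholder credentials and paths):
#############
#!/bin/bash
# upload only if the local copy is newer than the remote one
f=nanoblog/index.html
url="ftp://ftp.baldwinsoftware.com/nb/index.html"

# -I makes curl ask the FTP server (via MDTM) for the file's mtime
rmod=$(curl -s -I -u me:pwd "$url" | tr -d '\r' \
       | awk -F': ' '/^Last-Modified/{print $2}')

rsecs=0                                  # treat "absent" as very old
[ -n "$rmod" ] && rsecs=$(date -d "$rmod" +%s)
lsecs=$(stat -c %Y "$f")                 # local mtime, epoch seconds

if [ "$lsecs" -gt "$rsecs" ]; then
    curl -s -u me:pwd --ftp-create-dirs -T "$f" "$url"
fi
#############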


Also, I wonder if there is a better way to "glob" the file list, so I'm not sending 35 separate logins and curl requests, but
one for a whole bunch of files.
Rather than the find and for i in $(cat list) loop, I thought of the following
(haven't tried it yet, because I'm falling asleep at the keyboard now):
#############
#!/bin/bash

# sending up the nanoblog

echo "writing new file list"

for i in $(ls -1 nanoblog/)
do echo "$i", >> nblist
done
# that gives me a list, one item per line, each followed by a comma

# but to glob it in the curl command, I believe I need one line,
# so let's remove the line breaks:

perl -pi -e 'tr/[\012\015]//d' nblist
# there might be an easier way to do that with tr or sed or something...

echo "file list written, now sending files..."

flist=$(cat nblist)
# didn't think curl -T {$(cat list)} would be smart

curl -# -u me:pwd -T "{$flist}" ftp://ftp.baldwinsoftware.com/nb/

echo "files sent"
exit
###############
aargh...my brain hurts...
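
For the record, curl can indeed upload several files over one connection with brace globbing: -T "{a,b,c}" against a URL ending in a slash. And the comma-joined list can be built in one step with paste, which sidesteps the perl pass. A sketch (untested; assumes GNU find, placeholder credentials, top-level files only, and no commas in file names):
#############
#!/bin/bash
cd nanoblog || exit 1

# bare file names, joined into one comma-separated line
flist=$(find . -maxdepth 1 -type f -printf '%f\n' | paste -sd, -)

# one login, one session for the whole batch; the trailing "/"
# tells curl to append each uploaded file's name to the URL
curl -# -u me:pwd -T "{$flist}" ftp://ftp.baldwinsoftware.com/nb/
#############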



So, my questions are:

Is there a way to send a directory and its contents, including subdirectories, recursively, via ftp on the command line?
And, if so, what is it?
(I'm not finding it in the man page, and I did some googling before coming to ask, but
I only found info about recursive mget, and when I tried to apply it to mput, I did not achieve the desired result.)

and/or

Why do my curl efforts not give me the desired result?

One more:
I found wput, which will send whole directories, recursively, but I don't see in the man page where it authenticates with the server.
I need to log in to ftp up (no anonymous on my server, no way).

-
I did end up sending everything up with gftp for today, incidentally, but I will be updating
these pages frequently and would rather be able to script it and do it from the command line.
In fact, nanoblog, to my knowledge, can call a script to publish the darned thing, if I make a suitable
script and put it in the conf. So, yes, I do very much want to learn how to accomplish recursive putting of files to my server,
via the command line, for future use.

I'd like to have a script, really, that will send everything up, only overwriting an existing file on the remote server
when the local file has been touched more recently.

Thanks,
Tony

Comments:
From: zastrazzi
2010-02-15 03:03 pm (UTC)
Just a quick question: does your host support ssh? If yes, you might want to take a look at using either rsync or scp. Alternately, if it doesn't support ssh inbound, does it provide you with a shell of some kind via a web interface? If it does, you may still be able to use rsync/scp by starting the copy from the host.
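
A minimal sketch of that route, assuming ssh did work (host and paths here are placeholders):

# rsync recurses, preserves times, and only transfers what changed
rsync -avz nanoblog/ me@baldwinsoftware.com:public_html/nb/

# or a one-shot recursive copy, with no update check
scp -r nanoblog me@baldwinsoftware.com:public_html/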
From: tonytraductor
2010-02-15 03:51 pm (UTC)
Oh, yeah, if I could sshfs-mount the server here, or vice-versa, rsync would be pretty ideal.

Unfortunately, I do not have shell access to the server (using godaddy.com, due to mucho storage space at a tiny price, although I've learned I could get similar deals on hosts that do offer shell access, so probably when my contract is up with godaddy, I'll shop around...possibly).

nice icon (said toasting you with a steaming cup of Café Pilão (http://cafepilao.com.br), Brazil's strong coffee)
From: zastrazzi
2010-02-15 04:31 pm (UTC)
Coffee, I love the stuff. I'm currently working on a fresh press of the Guatemala Finca La Viña via Phil & Sebastian.

I actually prefer it to the Nicaraguan CoE they have when I'm making coffee at home, although the Nicaragua La Picona Cup of Excellence #4 is simply amazing off their Clover. Wow.

On a more on topic note, are you restricted to using ftp? Have you considered using ncftp or lftp instead?

http://bash.cyberciti.biz/backup/copy-all-local-files-to-remote-ftp-server-2/
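
For the recursive-put question specifically, lftp's mirror command is the most direct route; a sketch with placeholder credentials and paths:

# -R reverses the mirror (uploads the local tree);
# --only-newer skips files whose remote copy is up to date
lftp -u me,pwd ftp.baldwinsoftware.com -e "mirror -R --only-newer nanoblog/ nb/; quit"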
From: tonytraductor
2010-02-15 06:19 pm (UTC)
You know, I did look at ncftp last night, briefly.
I think that might be a very valid solution, especially after looking at that script.
If I had shell access, I'd just sshfs and rsync, which, imho, would be the ideal solution, but that's not an option, unfortunately.
The Brazilian coffee I drink has a strong, rich, velvety, almost chocolatey flavor, but very low acidity (unlike most Guatemalans, Nicaraguans, Colombians, etc.).
About the only thing possibly better, imho, is maybe some Jamaican Blue Mountain coffee.

thanks for your assistance!
(what I'm playing with is nanoblog, which is pretty cool.
Results of early experimentation, just uploaded with gftp, in this case, are here: http://www.baldwinsoftware.com/nb/index.html)
From: tonytraductor
2010-02-16 06:08 pm (UTC)

append

I didn't use anything quite like that script because, in reality, it's much simpler.

just:
ncftpput -u myname -p password -m -A -R www.myserver.com remote_dir local_files

that's all that's needed.
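
As I read the ncftpput man page (worth double-checking), those flags break down as:

# -u/-p  login credentials
# -m     attempt to make the remote directory before copying
# -A     append to remote files instead of overwriting them
# -R     recurse through subdirectories
ncftpput -u myname -p password -m -A -R www.myserver.com remote_dir local_files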

But I do have what may be perceived by some as a stupid question.

I am using the "-A" option, for "append".
Am I correct in the assumption that this append means, if the local file and remote file are identical, there is nothing to do? i.e., basically the remote files will be updated (new material from the local file will be "appended" to them, similar to rsync -u or cp -u)?
From: tonytraductor
2010-02-15 05:03 pm (UTC)
I found wput (http://wput.sourceforge.net), which will send a directory tree recursively, but I can't find in the man page where it authenticates with the server (and I don't enable anonymous ftp uploading to my server, of course).
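
For what it's worth, wput appears to take wget-style credentials embedded in the URL itself; an untested sketch with placeholder user/password:

# user:password ride along in the URL, as with wget
wput nanoblog/ ftp://me:pwd@ftp.baldwinsoftware.com/nb/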
From: simoncion
2010-02-16 01:02 am (UTC)
Install FUSE and try this: http://curlftpfs.sourceforge.net/

With this you can mount FTP directories into your filesystem and cp and rm and mv to your heart's delight.
From: tonytraductor
2010-02-16 01:15 am (UTC)
But fusermount requires shell access.
I don't have shell access to the remote host.
I know, because I use sshfs between my laptops and my main desktop all the time. This would be ideal if I had shell access, of course.
Then I'd just use rsync.
Thanks for the suggestion, all the same.
From: tonytraductor
2010-02-16 01:21 am (UTC)
oh...wait...
Maybe this will work.
It appears that it's using ftp, but imitating the mounting of a remote fs.
I'm going to have to read the man/docs a bit more, but this might just do the trick.
Somebody else also mentioned ncftp, which might do it.
I tried looking at wput (like wget, only backwards), but I couldn't see how that authenticates on the remote host.
This curlftpfs looks even easier than ncftp.
From: simoncion
2010-02-16 01:56 am (UTC)
> It appears that it's using ftp, but imitating mounting of a remote fs.

Exactly. Here are the basics of using the tool:

curlftpfs -s -o user=username:pass,fsname=whatever remote.host/some/optional/path /local/mount/point
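
A possible follow-up once it's mounted (paths are placeholders, and cp -u trusts the timestamps the server reports):

mkdir -p ~/mnt/site
curlftpfs -o user=me:pwd ftp.baldwinsoftware.com ~/mnt/site

# -r recurses; -u copies only when the source file is newer
cp -ru nanoblog/* ~/mnt/site/nb/

fusermount -u ~/mnt/site   # unmount when done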