[#] Sun Mar 29 2009 23:19:35 EDT from Ford II @ Uncensored


i may just be an idiot, but trying to set up an ftp server through NAT

is a pain in the ass.

Eeeek. Can you even do that? FTP is too cool for school and has this open-up-a-connection-back-to-the-source thing built into it. Really cool in the '70s, not so cool now.
See if it's got some passive mode setting. You always want to use passive mode; that makes ftp use the original connection for everything, so as long as you can proxy that one connection out it should work.
If there's a global setting for passive mode, use that. You never want to skip passive mode nowadays.

[#] Sun Mar 29 2009 23:23:58 EDT from Ford II @ Uncensored


What you're describing is almost impossible ... any chance you could do

some social engineering and find out what they're running?

I'm perfectly willing to accept that I'm crazy; this really is out of my field. I'll see who I can ask, but asking questions is just going to make me a suspect.
As it is, I think the network lady (who I actually like, and I think I used to get along with) hates me because I was on a demo call and I kinda told it like it was. Basically the nice network lady had this guy in to pitch us some do-all network utility monitoring system, I forget what it's called.
And at the time we were having database problems, and I was on the call to see if this thing would help; I didn't know it was a network tool pitch or I wouldn't have bothered.
But since I was on the call I told them what I thought, and erm... well, that didn't go over very well with the network lady.
I didn't mean to piss her off. This was definitely something we need, and I said that, but it wasn't going to solve our current problem, so basically she got shut down on that whole network tool thing.
So I think I kinda want to get in her good graces again before I go asking her how to subvert her network.
Whatever they're doing, though, they just made it across the board last week, because now my port 25 thing is acting the same way as 443.

[#] Sun Mar 29 2009 23:54:51 EDT from IGnatius T Foobar @ Uncensored


FTP through NAT is pretty much impossible unless the NAT engine knows how to spoof FTP in both directions.  The only reason this isn't even more of a problem is because FTP is so common that nearly all NAT implementations know how to do this.  Once in a while you run into a setup done by a careless network admin who inadvertently turned it off.



[#] Mon Mar 30 2009 12:21:59 EDT from Peter Pulse @ Uncensored


PASV mode doesn't use the original connection.. it still uses a second connection to transfer the data. But instead of the server connecting back to the client, the server waits on a socket and tells the client which port number.. then the client opens the second connection. It solves the problem of a client that can make as many outbound connections as it wants but can't accept inbound connections. FTP is actually kind of a clever protocol in some ways.. but it doesn't seem like it anymore, because the cleverest thing you can do with it is something nobody does anymore.. which is, you can be on computer "A" and transfer a file between computer "B" and computer "C" by manipulating the modes.
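To make the PASV handshake concrete, here's a minimal sketch of the one piece of client-side work it involves: decoding the server's `227` reply into the address the client should open its data connection to. (The reply text and numbers below are just an illustrative example, not from the thread.)

```python
import re

def parse_pasv(reply):
    """Parse a PASV reply like '227 Entering Passive Mode (h1,h2,h3,h4,p1,p2)'
    into the (host, port) the client should connect its data channel to."""
    m = re.search(r"\((\d+),(\d+),(\d+),(\d+),(\d+),(\d+)\)", reply)
    if not m:
        raise ValueError("not a PASV reply: %r" % reply)
    h1, h2, h3, h4, p1, p2 = (int(n) for n in m.groups())
    # The port is split across two bytes: high * 256 + low.
    return "%d.%d.%d.%d" % (h1, h2, h3, h4), p1 * 256 + p2

print(parse_pasv("227 Entering Passive Mode (192,168,1,10,19,137)"))
# ('192.168.1.10', 5001)
```

Since the client originates both connections, only outbound access is needed, which is exactly why passive mode survives NAT.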

[#] Mon Mar 30 2009 13:00:05 EDT from IGnatius T Foobar @ Uncensored


FTP was designed during an era when there were very few firewalls and no NAT.
IPv6 will supposedly bring us back to the days of end-to-end everywhere, but I suspect there will still be people who insist on adding NAT into the mix anyway, probably for reasons that aren't very good. And even with passive mode there's still the issue that the server opens up a second inbound port for the data channel, but the firewall doesn't know that.

I suppose the way passive mode should have worked would have been: the server gives the client a cookie for the data connection, then the client connects to port 21 and identifies itself with that cookie, so then you have your separate socket for the data transfers, but it all runs over the same rendezvous port so there are no issues at the network edge.

Too late to change it now.

[#] Mon Mar 30 2009 16:48:16 EDT from IGnatius T Foobar @ Uncensored


FTP is a UDP based protocol. You might want to scan using UDP in nmap.


Not a safe assumption! He might be using FTP-NG which opens a GRE tunnel to the other end.

[#] Mon Mar 30 2009 23:13:38 EDT from Ford II @ Uncensored


PASV mode doesn't use the original connection.. it still uses a second

connection to transfer the data. But instead of the server connecting


Learn something every day.

[#] Mon Mar 30 2009 23:14:43 EDT from Ford II @ Uncensored


I suppose the way passive mode should have worked would have been: the

server gives the client a cookie for the data connection, then the
client connects to port 21 and identifies itself with that cookie, so


Why couldn't they just send data and control information over the same socket? Not that hard, methinks. Saves all that extra connection handling stuff.
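The single-socket idea above is basically framing: tag each chunk so the receiver can tell control from data. A toy sketch, assuming a made-up frame format (one channel byte plus a 4-byte big-endian length) that is purely illustrative, not any real protocol:

```python
import struct

# Toy framing: 1-byte channel tag (b'C' control / b'D' data) + 4-byte
# big-endian length, then the payload. Everything flows over one socket.
def encode_frame(channel, payload):
    return channel + struct.pack(">I", len(payload)) + payload

def decode_frame(buf):
    """Peel one frame off the front: return (channel, payload, rest)."""
    channel = buf[:1]
    (length,) = struct.unpack(">I", buf[1:5])
    return channel, buf[5:5 + length], buf[5 + length:]

stream = encode_frame(b"C", b"STOR hello.txt") + encode_frame(b"D", b"file bytes...")
ch, payload, rest = decode_frame(stream)
print(ch, payload)  # b'C' b'STOR hello.txt'
```

This is essentially what later protocols like SSH (and hence SFTP) do: multiplex channels over one connection, at the cost of losing the plain-text control stream you could drive with telnet.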

[#] Tue Mar 31 2009 12:02:15 EDT from IGnatius T Foobar @ Uncensored


Of course, and if you use a modern protocol such as SFTP it multiplexes everything over the same SSH connection.

Back in 1980 when RFC 765 was written, though, it was a different world. There was no NAT, there were probably no firewalls, and they thought they were doing something clever by using all those ports.

(Our forefathers didn't have telephones; they didn't have answering machines; and if they weren't there you couldn't talk to them. Isn't science wonderful.)

[#] Tue Mar 31 2009 13:29:48 EDT from error0x6 @ Uncensored


For some reason, I keep reading that FTP uses the TCP protocol, so I've been setting all of my port openings for it to TCP connections. I may just be really tired, though. I'll get to play around with it some more, maybe tomorrow.
Thanks for everyone's input, I'll let you know if I am successful.

[#] Tue Mar 31 2009 13:35:27 EDT from Peter Pulse @ Uncensored


The 1980 TCP/IP FTP was based on the earlier NCP FTP that had been running for years. And why should they add another level of multiplexing when they already had two levels of multiplexing going on? They had all those sockets, why not use them? That way they could leave the control connection as nice human-readable text for controlling things and sending back progress info.. so you didn't even really need an FTP client, you could connect to it with a telnet client or a simple program. And then, by having separate ports, you could do what I was talking about earlier.. transfer files from one place to another while controlling it from a third place.
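The three-machine trick described above (often called FXP) works because the client only ever drives the control channels: from machine A you tell server C to listen (PASV), relay C's address to server B (PORT), then issue the transfer commands, and the data flows B-to-C directly. A sketch, assuming made-up names; only the reply-to-PORT translation is runnable here, with the command sequence in comments:

```python
import re

def pasv_to_port_cmd(pasv_reply):
    """Turn one server's PASV reply into the PORT command for the other server."""
    m = re.search(r"\((\d+,\d+,\d+,\d+,\d+,\d+)\)", pasv_reply)
    if not m:
        raise ValueError("not a PASV reply: %r" % pasv_reply)
    return "PORT " + m.group(1)

# The full sequence driven from controlling client A would be roughly:
#   A -> C: PASV                     (C listens, replies with its address)
#   A -> B: PORT h1,h2,h3,h4,p1,p2   (B will connect out to C's listener)
#   A -> C: STOR somefile            (C accepts the incoming data connection)
#   A -> B: RETR somefile            (B connects to C and pushes the bytes)
print(pasv_to_port_cmd("227 Entering Passive Mode (10,0,0,5,19,137)"))
# PORT 10,0,0,5,19,137
```

Machine A never touches the file's bytes at all, which is what made the trick so cheap on a slow link.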

[#] Tue Mar 31 2009 14:14:28 EDT from IGnatius T Foobar @ Uncensored


what I was talking about earlier.. transfer files from one place to
another while controlling it from a third place.

Today this is known as a "botnet" :)

[#] Tue Mar 31 2009 14:55:40 EDT from Peter Pulse @ Uncensored


Funny :) I actually learned about this whole flexibility of FTP from working at the videotext place in the early-mid 90's. Ig will probably remember this project and maybe Ford. I had a videotext transmission system and all my code ran on NeXTSTEP and linux (.99!), and I had an ethernet network going within my part of the office and in my machine room (I was using KA9Q NOS to do a demand-dialed SLIP out to the linux boxes at the broadcast stations, because the linux demand dialer was thoroughly broken). Anyway, the guys who were generating the videotext content, the firmware updates etc, all the stuff to transmit.. they were using tools they had developed in Windows 3.1 and DOS.. all stuff that was supposed to be replaced by NeXTSTEP equivalents.
But it became apparent that those guys were not going to give up these DOS/Windows things.. so I had to make it all work together. So I got a copy of Novell's TCP/IP stack for DOS.. that worked alongside their regular IPX/SPX stuff. It had some stupid name like "DOS Services for UNIX". So initially I thought I would have their programs open a socket and talk to my programs. But it turned out to be a nightmare. So, the Novell stuff came with a very functional FTP and telnet.. so I shelled out and ran those to move stuff around and send messages. I used paper clips to run ethernet cable on the ceiling tile into their offices... It actually worked well. They would update some content and hit a button, and kazzam, less than a minute later the change would appear on the videotext unit. Anyway, the point of the story is that Novell included a pretty cool Windows file browser that would let you move files locally within your own computer or from any computer to any computer just by dragging it. And I was perplexed at first as to how they did that, because we did not have a Novell file server or anything.
But they were doing it with FTP and multiple control connections.

[#] Tue Mar 31 2009 23:03:50 EDT from Ford II @ Uncensored


(Our forefathers didn't have telephones; they didn't have answering
machines; and if they weren't there you couldn't talk to them. Isn't

science wonderful.)

Yes. Indeed. That was probably the best premonition for "progress" I ever accidentally said.

[#] Tue Mar 31 2009 23:09:07 EDT from Ford II @ Uncensored


I didn't know you could do the ftp between B and C from A. Certainly neat stuff. Instead of botnet I was thinking gnutella. Although today's gnutella isn't your father's gnutella; the layers of efficiency/complexity they've put into things like Azureus are really, really neat.
I vaguely recall that videotext stuff; more, it makes me remember how far linux has come since those days.
And Novell... I never got into (or had a reason to play with) Novell, but you could just tell it was sharp. That was probably the last period in time (along with Windows 3.11) when people wrote really good systems.

[#] Wed Apr 01 2009 09:28:56 EDT from IGnatius T Foobar @ Uncensored


I wish I'd had a chance to see the videotext system. It sounded cool (and still does).

[#] Wed Apr 01 2009 15:36:46 EDT from Peter Pulse @ Uncensored


Yeah, it's sad that the project went under. We worked really hard on that, and through that project I learned the idiocy of the strict top-down approach.
We designed a huge, complex system.. and produced nothing useful. It was only toward the end, when it was just me and the engineers, that I started doing things bottom up, worrying about one problem at a time, testing as I went. Then things started to work. For example, initially I was working on what they called the "collection" system.. the system which took data inputs from various sources (eg news feeds, weather station, train schedules etc) and fed them into the authoring system. This was a stupid, ill-defined waste of time and money, because there was never enough money to build the whole huge design, and few actual examples of data sources I would actually have to interface with.. so I was toiling away on this abstract thing that couldn't be tested and would never be used.
Once the (IBM trained) managers were fired, I started working at the other end of the system.. which was writing the code that talked over a modem to the DOS box at the broadcast station, which in turn interfaced to the box that inserted the data in the blanking interval of the video. By writing something that could do that, I discovered major flaws in the code on that DOS box.. which ultimately could not be fixed in DOS, but instead allowed me to get a linux box in there and rewrite everything to use sockets over a SLIP connection.. so we got that critical part of the system working.. both the broadcast end and the control end. Then I worked my way backwards, writing or putting together whatever I needed as I went.. a queuing system, a remote control application, a multiplexing setup (sort out what packets needed to go to what stations), a recovery system (keep track of what packets were in the queues in the broadcast stations, so that if the broadcast station computer was rebooted or even replaced, its state would be restored from my master location).. then getting it all connected back to the different content producing applications...
Until we actually had something we could use to run a pilot project with a real station. Unfortunately, it was then that we found a major flaw in the firmware of a mask-programmed chip in the set top box.. and there was no money or time left to fix that. So it was over. But if we had worked bottom up the whole time, not wasted a year on some grand plan, the pilot project would have started much earlier, we would have found the firmware problem sooner, and we would have written a lot of stuff better. For example, the videotex screen editor that was written (at great expense) for NeXTSTEP would probably have been used rather than abandoned, if we had gotten it onto people's desks right away instead of waiting for the completion of a gargantuan system around it. Instead, we used the prototype dos-based editor that had been used for so long and had so many tweaks that nobody would give it up.
I have always preferred systems where you get the code running and see it running, and you go from there. Yes, you have to have some overall plan first if you don't want a total hodgepodge.. but I think it works best when you have a basic plan at the top.. but then you start working at the bottom, seeing each component working as you go...

[#] Wed Apr 01 2009 15:44:58 EDT from Ford II @ Uncensored


want a total hodgepodge.. but I think it works best when you have a
basic plan at the top.. but then you start working at the bottom,
seeing each component working as you go...

Hear, hear!

[#] Wed Apr 01 2009 20:34:48 EDT from fleeb @ Uncensored


Yep, that's pretty much how I prefer to go.

Sometimes, I will even write simple test programs for the more complicated algorithms and such that I want to implement within the context of a server or something, just so I know that the thing will work before I even drop it into the server.  That's served me so well, so many times, I never feel I'm cheating any time by doing it.

Certainly, my boss has never complained about that approach, unlike my last job, where it was considered a waste of time.
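The test-program habit described above might look like this in practice: prove the tricky piece works standalone before dropping it into the server. A sketch with a hypothetical algorithm (a token-bucket rate limiter, chosen purely for illustration; none of these names come from the thread):

```python
# Standalone harness for a "complicated algorithm" before server integration.
class TokenBucket:
    """Allow bursts up to `capacity` requests, refilling `refill` per tick."""

    def __init__(self, capacity, refill):
        self.capacity = capacity
        self.tokens = capacity
        self.refill = refill

    def tick(self):
        # One unit of time passes; add tokens, capped at capacity.
        self.tokens = min(self.capacity, self.tokens + self.refill)

    def allow(self):
        # Spend a token if one is available.
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False

if __name__ == "__main__":
    tb = TokenBucket(capacity=2, refill=1)
    assert tb.allow() and tb.allow()   # burst up to capacity
    assert not tb.allow()              # bucket is empty
    tb.tick()
    assert tb.allow()                  # refilled after one tick
    print("ok")
```

Once the assertions pass here, wiring the class into the real server is a much smaller leap of faith.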



[#] Wed Apr 01 2009 21:21:44 EDT from Ford II @ Uncensored


I'm sure I whined to you about this recently with my silly plugin problem: it's nearly impossible to debug a shared object in a running iPlanet webserver, so I managed to build a standalone program and link in all of the iPlanet libraries, so it actually builds and runs, and when I run it as a plugin it all just works.
