I may just be an idiot, but trying to set up an FTP server through NAT is a pain in the ass.
Eeeek. Can you even do that? FTP is too cool for school and has this open-up-a-connection-back-to-the-source thing built into it.
Really cool in the '70s, not so cool now.
See if it's got some passive mode setting. You always want to use passive mode; that makes FTP use the original connection for everything, so as long as you can proxy that one connection out it should work.
If there's a global setting for passive mode, use that. You basically never want to run without passive mode nowadays.
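For what it's worth, here's roughly what that looks like from the client side with Python's stock ftplib (host and credentials are placeholders):

    from ftplib import FTP

    # Host and credentials are made up. Passive mode keeps all
    # connection-opening on the client side, which is the part
    # NAT can actually cope with.
    ftp = FTP()
    ftp.connect("ftp.example.com", 21)   # the one control connection
    ftp.login("user", "password")
    ftp.set_pasv(True)                   # force PASV (ftplib's default anyway)
    print(ftp.nlst())                    # the listing rides a client-initiated
                                         # data connection under the hood
    ftp.quit()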
What you're describing is almost impossible ... any chance you could do
some social engineering and find out what they're running?
I'm perfectly willing to accept that I'm crazy; this really is out of my field. I'll see who I can ask, but asking questions is just going to make me a suspect.
As it is, I think the network lady (who I actually like, and I think I used to get along with) hates me, because I was on a demo call and I kinda told it like it was. Basically the nice network lady had this guy in to pitch us some do-all network monitoring utility, I forget what it's called.
At the time we were having database problems, and I was on the call to see if this thing would help; I didn't know it was a network tool pitch or I wouldn't have bothered.
But since I was on the call I told them what I thought, and erm... well, that didn't go over very well with the network lady.
I didn't mean to piss her off. This was definitely something we need, and I said so, but it wasn't going to solve our current problem, so basically she got shut down on that whole network tool thing.
So I think I kinda want to get in her good graces again before I go asking her how to subvert her network.
Whatever they're doing, though, they rolled it out across the board last week, because now my port 25 thing is acting the same way as 443.
FTP through NAT is pretty much impossible unless the NAT engine knows how to rewrite the addresses inside the FTP control traffic in both directions. The only reason this isn't an even bigger problem is that FTP is so common that nearly all NAT implementations know how to do this. Once in a while you run into a setup done by a careless network admin who inadvertently turned it off.
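To make that concrete (my own illustration, nothing from the thread): the control channel carries IP addresses as plain text, which is exactly what NAT breaks. RFC 959's PORT command packs the address and port into six decimal bytes:

    # RFC 959's PORT command carries the client's IP address and port as
    # six decimal bytes of text on the control channel -- the thing a NAT
    # engine has to find and rewrite.

    def make_port_arg(ip: str, port: int) -> str:
        """Encode (ip, port) the way an active-mode client announces it."""
        return ",".join(ip.split(".") + [str(port >> 8), str(port & 0xFF)])

    def parse_port_arg(arg: str) -> tuple[str, int]:
        """Decode a PORT argument, as a NAT engine must before rewriting."""
        parts = arg.split(",")
        return ".".join(parts[:4]), (int(parts[4]) << 8) | int(parts[5])

    # A client behind NAT announces its *private* address:
    print(make_port_arg("192.168.1.10", 50000))  # -> 192,168,1,10,195,80
    # Unless the NAT engine rewrites that string (and maps the port), the
    # server will try to connect back to an unroutable address.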
IPv6 will supposedly bring us back to the days of end-to-end everywhere, but I suspect there will still be people who insist on adding NAT into the mix anyway, probably for reasons that aren't very good. And even with passive mode there's still the issue of the server opening up a second inbound port for the data channel that the firewall doesn't know about.
I suppose the way passive mode should have worked would have been: the server gives the client a cookie for the data connection, then the client connects to port 21 and identifies itself with that cookie, so then you have your separate socket for the data transfers, but it all runs over the same rendezvous port so there are no issues at the network edge.
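Just as a toy model of that idea (this is emphatically not how FTP works, and every name in it is invented), the rendezvous might look like:

    import secrets
    import socket
    import threading

    # Hypothetical protocol: control and data connections both arrive on
    # the same listening port; the client labels the second one with the
    # cookie it was given on the first.
    HOST, PORT = "127.0.0.1", 2121
    srv = socket.create_server((HOST, PORT))
    pending = {}   # cookies handed out on control connections

    def serve():
        while True:
            conn, _ = srv.accept()
            line = conn.recv(64).decode().strip()
            if line == "HELLO":                       # a control connection
                cookie = secrets.token_hex(8)
                pending[cookie] = True
                conn.sendall(f"COOKIE {cookie}\n".encode())
            elif line.startswith("DATA ") and pending.pop(line[5:], None):
                conn.sendall(b"...file bytes would flow here...\n")
            conn.close()

    threading.Thread(target=serve, daemon=True).start()

    ctrl = socket.create_connection((HOST, PORT))     # rendezvous port
    ctrl.sendall(b"HELLO\n")
    cookie = ctrl.recv(64).decode().split()[1]
    data = socket.create_connection((HOST, PORT))     # same port again
    data.sendall(f"DATA {cookie}\n".encode())
    print(data.recv(64).decode())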
Too late to change it now.
FTP is a UDP-based protocol. You might want to scan using UDP in nmap.
Not a safe assumption! He might be using FTP-NG which opens a GRE tunnel to the other end.
PASV mode doesn't use the original connection.. it still uses a second
connection to transfer the data. But instead of the server connecting
back to the client, the client opens the data connection out to the
server, which is why it gets through NAT.
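Concretely, the server's 227 reply names the host and port it's listening on, and the client parses it and dials out. The reply text below is invented, but it follows RFC 959's format:

    import re

    # The 227 response tells the client where to make the second connection.
    reply = "227 Entering Passive Mode (203,0,113,5,217,89)"

    h1, h2, h3, h4, p1, p2 = map(int, re.search(
        r"\((\d+),(\d+),(\d+),(\d+),(\d+),(\d+)\)", reply).groups())
    ip = f"{h1}.{h2}.{h3}.{h4}"
    port = (p1 << 8) | p2
    print(ip, port)   # 203.0.113.5 55641 -- the client connects OUT to this,
                      # which is why passive mode survives client-side NAT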
Learn something every day.
> I suppose the way passive mode should have worked would have been: the
> server gives the client a cookie for the data connection, then the
> client connects to port 21 and identifies itself with that cookie
Why couldn't they just send data and control information over the same socket? Not that hard, methinks. Saves all that extra connection handling stuff.
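Something like this hypothetical framing would do it: a type byte plus a length in front of every message, so control and data share one socket and the receiver can pull them apart again (all names invented):

    import struct

    CONTROL, DATA = 0, 1

    def frame(kind: int, payload: bytes) -> bytes:
        """Prefix a payload with a type byte and a 4-byte length."""
        return struct.pack("!BI", kind, len(payload)) + payload

    def read_frame(buf: bytes) -> tuple[int, bytes, bytes]:
        """Return (kind, payload, whatever is left in the buffer)."""
        kind, length = struct.unpack("!BI", buf[:5])
        return kind, buf[5:5 + length], buf[5 + length:]

    wire = frame(CONTROL, b"RETR foo.txt") + frame(DATA, b"...contents...")
    kind, payload, wire = read_frame(wire)
    print(kind, payload)   # 0 b'RETR foo.txt'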
Back in 1980 when RFC 765 was written, though, it was a different world. There was no NAT, there were probably no firewalls, and they thought they were doing something clever by using all those ports.
(Our forefathers didn't have telephones; they didn't have answering machines; and if they weren't there you couldn't talk to them. Isn't science wonderful.)
Thanks for everyone's input, I'll let you know if I'm successful.
> what I was talking about earlier.. transfer files from one place to
> another while controlling it from a third place.
Today this is known as a "botnet" :)
But it became apparent that those guys were not going to give up these DOS/Windows things.. so I had to make it all work together. So I got a copy of Novell's TCP/IP stack for DOS.. that worked alongside their regular IPX/SPX stuff. It had some stupid name like "DOS Services for UNIX". Initially I thought I would have their programs open a socket and talk to my programs, but it turned out to be a nightmare.

The Novell stuff came with a very functional FTP and telnet.. so I shelled out and ran those to move stuff around and send messages. I used paper clips to run ethernet cable over the ceiling tile into their offices... It actually worked well. They would update some content and hit a button, and kazzam, less than a minute later the change would appear on the videotex unit.

Anyway, the point of the story is that Novell included a pretty cool Windows file browser that would let you move files locally within your own computer, or from any computer to any computer, just by dragging. And I was perplexed at first as to how they did that, because we did not have a Novell file server or anything.
But they were doing it with FTP and multiple control connections.
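For the curious, that trick is nowadays called FXP, and many servers refuse it because the same mechanism enables the "FTP bounce" attack. A rough sketch with Python's ftplib, with hosts, credentials, and filename all made up:

    from ftplib import FTP

    src = FTP("ftp.example.org")      # where the file lives
    dst = FTP("ftp.example.net")      # where it should end up
    src.login("user", "pass")
    dst.login("user", "pass")
    src.voidcmd("TYPE I")             # binary mode on both ends
    dst.voidcmd("TYPE I")

    # Ask the source to listen, then hand its address to the destination.
    resp = src.sendcmd("PASV")        # "227 ... (h1,h2,h3,h4,p1,p2)"
    dst.sendcmd("PORT " + resp[resp.index("(") + 1:resp.index(")")])

    # The destination connects out to the source; the data never touches us.
    dst.sendcmd("STOR bigfile.bin")   # 150: destination opens the connection
    src.sendcmd("RETR bigfile.bin")   # 150: source starts sending
    src.voidresp()                    # wait for both 226 completion replies
    dst.voidresp()                    # (voidresp is ftplib's response reader)
    src.quit(); dst.quit()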
> (Our forefathers didn't have telephones; they didn't have answering
> machines; and if they weren't there you couldn't talk to them. Isn't
> science wonderful.)
Yes. Indeed. That was probably the best premonition about "progress" I ever accidentally said.
I vaguely recall that videotex stuff; mostly it makes me realize how far Linux has come since those days.
And Novell... I never got into (or had a reason to play with) Novell, but you could just tell it was sharp. That was probably the last period in time (along with Windows 3.11) when people wrote really good systems.
We designed a huge, complex system.. and produced nothing useful. It was only toward the end, when it was just me and the engineers, that I started doing things bottom-up, worrying about one problem at a time, testing as I went. Then things started to work.

For example, initially I was working on what they called the "collection" system.. the system which took data inputs from various sources (e.g. news feeds, weather station, train schedules, etc.) and fed them into the authoring system. This was a stupid, ill-defined waste of time and money, because there was never enough money to build the whole huge design, and few actual examples of data sources I would actually have to interface with.. so I was toiling away on this abstract thing that couldn't be tested and would never be used.

Once the (IBM-trained) managers were fired, I started working at the other end of the system.. writing the code that talked over a modem to the DOS box at the broadcast station, which in turn interfaced to the box that inserted the data in the blanking interval of the video. By writing something that could do that, I discovered major flaws in the code on that DOS box.. which ultimately could not be fixed in DOS, but instead allowed me to get a Linux box in there and rewrite everything to use sockets over a SLIP connection.. so we got that critical part of the system working, both the broadcast end and the control end.

Then I worked my way backwards, writing or putting together whatever I needed as I went.. a queuing system, a remote control application, a multiplexing setup (sort out what packets needed to go to what stations), a recovery system (keep track of what packets were in the queues in the broadcast stations, so that if the broadcast station computer was rebooted or even replaced, its state would be restored from my master location).. then getting it all connected back to the different content-producing applications... until we actually had something we could use to run a pilot project with a real station.

Unfortunately, it was then that we found a major flaw in the firmware of a mask-programmed chip in the set-top box.. and there was no money or time left to fix that. So it was over. But if we had worked bottom-up the whole time, not wasted a year on some grand plan, the pilot project would have started much earlier, we would have found the firmware problem sooner, and we would have written a lot of stuff better. For example, the videotex screen editor that was written (at great expense) for NeXTSTEP would probably have been used rather than abandoned, if we had gotten it onto people's desks right away instead of waiting for the completion of a gargantuan system around it. Instead, we used the prototype DOS-based editor that had been used for so long and had so many tweaks that nobody would give it up.

I have always preferred systems where you get the code running and see it running, and you go from there. Yes, you have to have some overall plan first if you don't want a total hodgepodge.. but I think it works best when you have a basic plan at the top.. but then you start working at the bottom, seeing each component working as you go...
> want a total hodgepodge.. but I think it works best when you have a
> basic plan at the top.. but then you start working at the bottom,
> seeing each component working as you go...
Yep, that's pretty much how I prefer to go.
Sometimes I will even write simple test programs for the more complicated algorithms that I want to implement within the context of a server or something, just so I know the thing will work before I even drop it in. That's served me so well, so many times, that I never feel I'm wasting time by doing it.
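Nothing fancy, either. A throwaway harness like this (the function under test here is a made-up stand-in for the real algorithm) is usually all it takes:

    # Exercise one tricky function in isolation before it goes anywhere
    # near the server.

    def parse_csv_ints(line: str) -> list[int]:
        """Tolerant comma-separated integer parser (invented example)."""
        return [int(tok) for tok in line.split(",") if tok.strip()]

    if __name__ == "__main__":
        assert parse_csv_ints("1,2,3") == [1, 2, 3]
        assert parse_csv_ints(" 4 , 5 ") == [4, 5]
        assert parse_csv_ints("") == []
        print("all cases pass; safe to drop into the server")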
Certainly, my boss has never complained about that approach, unlike my last job, where it was considered a waste of time.