This week, I learned about some of the problems associated with global, static variables, destructors, and some of the odd consequences one experiences with these things when a process ends.
For what I need to do, I inject another process with my shared object (or DLL in Windows-speak), then the shared object hooks certain functions in the process into which it is injected so it can see how those functions get called.
Spiffy fine, works okay for those processes that do not actively pay attention to uninvited guests, but it gets weird when the process ends.
I'm caching information that I need to send, and when the process goes away, so, too, does that information. The only signal available to me that the process is ending (that I can rely upon) is the destruction of an object.
But static objects in different translation units are destroyed in an unspecified order when the process exits... so if you rely on one of those things to hang around while you're busy sending that cached information, you will know suffering and pain.
When you use third-party libraries for some of your needs, and you don't realize that they are using such variables, your pain grows.
On the one hand, this week has been tough, trying to track this kind of thing down. On the other hand, now that I've solved it, I feel kind of like a rock star.
Yep. Same thing in Java... doing too much in static initializers can lead to all sorts of unexpected pain, because dependencies aren't set up yet and the order of class initialization is hard to grok (probably undefined, in general).
And you're not even supposed to use destructors there, like the "emergency stop, never use" lever.
Yeah, in C++, one is encouraged to use destructors as part of "RAII" (Resource Acquisition Is Initialization).
Admittedly, it is an abuse of destructors to use them for detecting that a process is ending, but my hand is forced, heh.
Right. An application should have a well-managed "startup" and "shutdown" sequence, all along the object dependency chain, which happens before your final destructors are hit.
I remember a funny thing in Gnome, circa 1997-1998 or so, where there was a plugin mechanism for the panel (taskbar equivalent) which was based on linux shared objects. The shutdown sequence was set up such that it all happened from the .so unload hooks, or something like that. Basically, the shutdown handlers were running on a call stack that wanted to return back into an image that had already been unloaded. Ooooops.
Yeah, it's... it's just not the same anymore, is it?
Last time I saw him was a random Grand Central Dining Concourse run-in, maybe?
Half a lifetime ago.
Been in touch more recently.
I didn't know about the second child, but I think I knew about the new job.
Maybe. He doesn't come to the cybersecurity stuff in DC as he has in the past. Which is kind of funny, now that I'm involved in that kind of stuff.
He could always come back as Edsel. Nobody would pick that.
the map of C++ Lands:
Heh... 'Bjarne Stroustrup tower'... I can't decide if that's kind of unkind.
Trying to explain the phrase "it's not a bug it's a feature" to a musician. Them: "Oh! It's playing it wrong and calling it jazz!"
found on twitter ;-)
will one of those be able to replace c/c++?
So didn't they forget to mention Java and hasn't that already happened?
As long as the Java interpreter is written in C/C++, no, that didn't happen.
That's something these languages try to achieve: replacing the full stack for the base logic.
Java may be the lingua franca of business logic, but that's about it.