Friday, December 23, 2022

Interesting

It's somewhat wrapped in 'you're special, and here's why', but this article has some interesting things to say about human psychology and our unwillingness to hear things we don't want to hear.

And how far we may go to punish those who persist in telling us anyway. 

It also gave me a couple more books to add to my always-growing 'to read' list. 

Monday, December 19, 2022

Twitter, Free Speech, and What Elon's doing

https://www.techdirt.com/2022/12/19/i-speak-fluent-new-social-media-ceo-whos-in-over-their-head-let-me-translate-the-last-few-days-of-twitter-policy/

Saturday, December 10, 2022

Weather Comment

I don't like the cold, so it seems strange to complain...

But why is it raining? In December?!?

It's seriously nuts, though ofc everyone is so beat down about the climate change debate (just like how other issues have come to a standstill) that nobody seems to bother commenting on how crazy it is. 

Abstraction - or is it Simplification?

A few posts back I mentioned I had some thoughts on abstraction, though as I think about it I'm not sure 'abstraction' is the right term for it. 

Perhaps it's that attempts to simplify something often just add complexity in a different way. 

Like whoever it was that did that risk analysis, where they said adding safety measures (like making cars safer in accidents) just makes people start taking riskier actions. 

And I feel the thoughts slipping away the more I try to explain that, so I'll just move on.

One of the things to understand about computers is that there's a LOT of abstraction going on. 

Let me start by briefly talking about numbering systems with different bases.

We use a decimal system, and we're so used to it that most people don't even think about it.

9+1=10

99+1=100

We count from 0 to 9, and when we run out of digits we add a '1' to the left, roll the 9 over to 0, and keep counting from there.

There are other systems that use a different base, like base 16. Base 16, or hexadecimal, requires more digits than decimal does, because it counts a bit longer before rolling over. We've pretty much decided to fill in the missing digits with letters from the alphabet, which is why base 16 uses A, B, C, D, E, and F.

9+1=A in hexadecimal, A+1=B, and so on until F+1=10. (Your computer probably has a calculator function which you can set to 'programmer' mode and convert between decimal, hexadecimal, octal (base 8), and binary (base 2).)
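
If you'd rather poke at this with a few lines of Python than with the calculator app, here's a quick sketch - nothing fancy, just the built-in conversion functions:

# int() parses a string as a number in whatever base you tell it;
# hex(), bin(), and oct() go the other direction.
print(int("61", 16))   # hexadecimal 61 -> 97 in decimal
print(hex(97))         # 0x61
print(bin(97))         # 0b1100001
print(oct(97))         # 0o141

# And the rollover described above: F + 1 = 10 in hexadecimal.
print(hex(0xF + 1))    # 0x10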

I brought that up because computers use binary, or base 2. That's essential, since you can use electrical circuits to represent 0 or 1 - everything is either off or on. (This may change with quantum computers, but that's a whole other level.)

Every single thing your computer does is basically processing a string of 0s and 1s.

Which is really, really, really hard for people to read and understand. So we added a layer of abstraction.

First came assembly language, which provided simple mnemonic instructions that were easier to use than raw binary. This section from Wikipedia's article on assembly language goes into more detail:

Assembly language

A program written in assembly language consists of a series of mnemonic processor instructions and meta-statements (known variously as declarative operations, directives, pseudo-instructions, pseudo-operations and pseudo-ops), comments and data. Assembly language instructions usually consist of an opcode mnemonic followed by an operand, which might be a list of data, arguments or parameters.[24] Some instructions may be "implied," which means the data upon which the instruction operates is implicitly defined by the instruction itself—such an instruction does not take an operand. The resulting statement is translated by an assembler into machine language instructions that can be loaded into memory and executed.

For example, the instruction below tells an x86/IA-32 processor to move an immediate 8-bit value into a register. The binary code for this instruction is 10110 followed by a 3-bit identifier for which register to use. The identifier for the AL register is 000, so the following machine code loads the AL register with the data 01100001.[24]

10110000 01100001

This binary computer code can be made more human-readable by expressing it in hexadecimal as follows.

B0 61

Here, B0 means 'Move a copy of the following value into AL', and 61 is a hexadecimal representation of the value 01100001, which is 97 in decimal. Assembly language for the 8086 family provides the mnemonic MOV (an abbreviation of move) for instructions such as this, so the machine code above can be written as follows in assembly language, complete with an explanatory comment if required, after the semicolon. This is much easier to read and to remember.

MOV AL, 61h       ; Load AL with 97 decimal (61 hex)

That was a lot, and I didn't want to cut out the context so I highlighted the parts I wanted to focus on the most.

First, you see the binary data that the computer is processing. Since everything is binary, the computer has to know, based on where it is in the program, which sequences of 0s and 1s are instructions and which are the values the instructions act upon. 

That's the distinction the quote draws between the opcode (the operation code, i.e. the thing to do) and the value. So we have:

10110000 01100001 

The first byte is the instruction telling the computer to move the following value to a specific place in the computer, and the second byte is the value to be moved.

And then we made it slightly more human readable by converting the binary to hexadecimal.

10110000 = B0; 01100001 = 61

For the specific computer architecture in the Wikipedia example, B0 becomes a mnemonic in assembly language: MOV AL. 

Basically, it says the computer should move the subsequent value into the register 'AL'. That value is 61 in hexadecimal, 97 in decimal. Hence the assembly language command:

MOV AL, 61h 
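
To make that byte layout concrete, here's a little Python sketch of my own (not from the Wikipedia article) that pulls the B0 61 machine code apart into its opcode, register, and value fields:

code = bytes([0xB0, 0x61])    # the machine code from the example

opcode   = code[0] >> 3       # top 5 bits: 10110, the 'move immediate' operation
register = code[0] & 0b111    # bottom 3 bits: 000, the identifier for AL
value    = code[1]            # the value being moved

print(bin(opcode))    # 0b10110
print(register)       # 0
print(value)          # 97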

Even with the mnemonic, assembly still isn't very easy to understand, is it?

So we abstracted again, and created higher-level programming languages: C, C++, Python, Java, COBOL, etc.

Each has its nuances (some are better suited to scientific work, since not every language handles the precision and extreme number sizes those calculations require), but the core logic is fairly similar, so once you've learned the logic you mostly just have to learn the specific syntax used in each language.
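
To put the layers side by side, here's the same 'put 97 somewhere' operation at each level of abstraction. (The Python line is my own toy comparison - an interpreter obviously does far more under the hood than a single MOV.)

# Machine code:   10110000 01100001
# Hexadecimal:    B0 61
# Assembly:       MOV AL, 61h
# Python:
al = 0x61    # i.e. 97 - just a named value, no registers to think about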

And they're still fairly complicated to use - especially since each tends to develop 'libraries', i.e. reusable code created for common tasks... and since these are common tasks, and programmers don't like reinventing the wheel, you then have to know which ones you want to use and import them into your program. (When I'm programming and need to know how to do something, half the time it's just a matter of importing the correct library and then giving the commands needed to use it.)
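
A trivial example of that workflow, say you wanted to hash some data: rather than implementing SHA-256 yourself, you'd find that Python's standard library already has it, and just import it.

import hashlib    # the standard library's hashing module

digest = hashlib.sha256(b"hello world").hexdigest()
print(digest)     # the hash, printed as a hex string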

Before continuing I'll add this - even though most people use the more human-readable languages, that doesn't mean that nobody has to know assembly anymore. Malware reverse engineers, in particular, tend to look at that level to see what a malicious program is doing. And the hackers themselves may be working at this level in order to take over a program...

Basically, the computer expects one sequence of zeroes and ones to be instructions and another sequence to be the values it acts on - so if you understand the computer architecture well enough, you can sometimes overwrite the original sequence. Instead of executing the command above, MOV AL..., the processor might be fed something like JMP, which jumps to another location and starts reading the code there. And if the hacker can force a JMP to their own code, the program will suddenly start following the instructions written there.
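
Here's a toy sketch of that idea in Python - just flipping bytes in a buffer, since a real exploit is far more involved. (EB really is x86's short-jump opcode; the offset here is a made-up value purely for illustration.)

code = bytearray([0xB0, 0x61])    # MOV AL, 61h - the instruction from before

# An attacker who can write over this memory might swap in a jump:
code[0] = 0xEB    # JMP rel8 - jump a short distance
code[1] = 0x10    # hypothetical offset: 16 bytes forward, toward attacker code

print(code.hex())    # eb10 - the CPU would now jump instead of loading AL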

To get back to the things I've noticed - we're dealing with some very complex things, so complex that I sometimes marvel it's all 0s and 1s at heart, and everyone seems to be looking for ways to simplify things or make them easier.

Except most of the time those 'simple' things are fairly complex in an entirely different way.

For example, when I started programming in Java our professor wanted us to use BlueJ as the development environment. There are others out there that do a lot more for you, like predictive text. She wanted us to start with something more basic so that we didn't learn to rely too much on that sort of thing. (I've noticed that when I type something out fully I tend to remember it better than when it's just a matter of typing 'ma + TAB' to select whatever is being suggested, so I think she had a decent point here).

When I took some of the later courses I decided that I needed to start using a more powerful tool. So I went looking around online, checked out the various recommendations, and decided to use IntelliJ.

It was so complicated and difficult to learn that I wondered if I'd chosen wrong, uninstalled it and installed one of the alternatives... found it just as complicated, uninstalled it and went back to IntelliJ. 

And then dug into the documentation, any videos or tutorials online, and figured out enough of it to do what I needed to do.

I still don't think I've mastered it, by any means, but it's... usable. Well, at least they also make an integrated development environment (IDE) for Python, so it's not like I have to relearn everything. (And I'm not really coding that much in the first place.)

Oh, and get this - it's not like we ever had any classes teaching us how to install and work with an IDE. Maybe when we started using BlueJ at the beginning, but choosing another tool was entirely up to us.

We come up with these really complicated tools to 'simplify' something, and to a certain extent they do. But they also add another layer of complexity. First, because you have to learn the tools themselves. Second, because sometimes that layer obfuscates the underlying issues, and you still have to dig deeper to figure things out. And third, because that 'simplification' leaves room for other complex actions.

I probably need to explain that in more detail, but I'm going to stop there for now. My dog needs some attention.



Wednesday, December 7, 2022

Drone Hive 2

If I'm talking about air and sea drones... Probably ought to think of land drones too.

Or more properly, diggers. 


Oooooooo - sappers! 

Drone Hive

I was thinking a bit more about drone swarms.

And really, if each drone has some sort of weapon (and it doesn't even have to be an explosive), it really could be like a swarm of bees... 

Or, in water, a school of piranhas. 

Which means a couple of things... 

Like, if they're a swarm of bees or hornets, then to defend you might want to take lessons from bears. 

Alternatively, have your own drone swarm... In which case it'd be like an air battle? Except you probably have to worry more about the OODA loop - i.e. which tech + controller can decide and act fastest (and observe and orient too, ofc, to get the OO in OODA).

Though the controllers are probably the weak point... If remote controlled there's a signal that can be jammed. Or intercepted.

Though if you program the swarm to monitor each other perhaps they could act even if jammed?

Hmmm.

Well, I'm sure there are other uses I'm not thinking of. Not just in fighting, but for supply runs and intel collection. 

Tech and War

Anyone in the military knows that tech changes tactics...

It's still hard to predict how, though. Or rather, what new tactics and counter-tactics are best. 

It's part of why I'm curious about drone usage in Afghanistan, because we've had drones for a while... But I think we've mostly used them more like unmanned planes. 

Anyways, that's why I read this article, and suddenly I'm reminded of drone shows like this (and I really wish I had the embed option in the mobile app here, so even if you don't click the link you can get a sense of what I'm talking about).

Anyways, imagine a swarm of drones like the ones shown here - except that giant turtle or dragon is moving all around an aircraft carrier. And shooting or dropping bombs or even just bashing themselves into vulnerable locations.

Oh, and good luck trying to get your planes to take off. 

Monday, December 5, 2022

You Know...

My previous post reminded me:

If you want the flexibility to be able to hire and fire people easily, maybe you ought to reconsider supporting the social safety net.

All those negative consequences of layoffs? A lot of that is because of how stressful it is to job hunt. 'People feel bad when bad things happen to people they know', and losing your job - especially when most people live paycheck to paycheck, healthcare is tied to employment, etc. - is probably one of the top five worst things that can happen to a person.

But, you know, realizing that investing in a social safety net could help a business is just not the sort of thing we see considered.

Just like businesses continue to do mass layoffs about as painfully as possible, despite the research. 

Bodies, Not Refrigerators

I heard once about a king who convinced his people to eat potatoes - by planting them in a field and posting guards, with the expectation that people would decide whatever was being guarded was valuable and sneak in to steal it.

I don't know if that's true or not (a quick internet search says the king was probably Frederick the Great, but that there's a debate over whether this is true or not) but I thought this said something interesting about what it takes to manage people.

I suppose I ought to clarify first - I'm not sure it's a great idea, mostly because it fits some of the concerns we have about 'nudging' people into doing what we think is the right thing. I'm torn between the acceptance that this is part of human nature, and yes it would be nice to make sure - for example - that people save for retirement... but it also smacks of paternalism, of deciding what's best for people without their input, and pushing them to make what you think is the 'right' choice. (Yes, if you make it 'opt out' rather than 'opt in' people who generally just stick with the default option will save for retirement, and anyone who feels strongly about it can always opt out... but is that really the right way of handling it?)

That said, it's an interesting illustration of how managing people is not at all like ordering a fleet of robots.

I was reminded of this because of Elon Musk's Twitter takeover. 

Actually, it was because I got into a discussion with someone about the mass layoffs, and mentioned my understanding that layoffs are actually pretty consistently a terrible thing for businesses. 

I've stumbled across studies to that effect off and on for years (though it's apparently not inevitable, as one article was all about how a company avoided most of those pitfalls), but if you think about human nature it's pretty easy to understand why.

When you're on a team, and someone gets fired, generally the people remaining feel scared... next time it might be them. 

They also tend not to trust their company as much. Layoffs are a stark reminder that companies think you're replaceable, are not loyal to you, and will fire you in a heartbeat.

Why put in a lot of hard work for a company that doesn't care about you? Why bust your butt if you might lose your job in another month?

Plus, while there may be a few egotistical maniacs who think they're somehow the elite 10% and that everyone else is dead weight (the social media discussion in question was partly because of a thread going around about 'whaling and culling') most people become friendly with their coworkers. 

Maybe not bosom buddies, and you may hate some of them, but in general the people you spend most of the week with are people in your circle. And people feel bad when bad things happen to those in their circle. (The company that avoided layoff pitfalls did so in part by giving their employees a chance to decide just who got laid off. Basically said 'we need to reduce by 10%, decide who those 10% are'. And even though decisions were made based on things like 'who has a kid and can't handle job hunting at this time', the people who remained felt respected and cared for by the company in a way that most companies completely fail to do).

This brought me to something my T'ai Chi instructor said, back when I was in college and trying out new things. He was talking about physical movement and not management, of course, but he would say we were 'bodies not refrigerators'.

By which he meant that people are almost always in motion. We're constantly shifting our balance - from one foot to another, forward and back. We walk basically by constantly falling and catching ourselves. We're not refrigerators, that just stand where they're put. We're constantly in motion, even if it's only slightly. (And in fact, we can get injured if we don't move. Like bed sores.)

I think this distinction is important on a different level - we're people, not robots.

We have good days and bad days. We get absent-minded and forgetful if stressed out about something else. (That's part of why the military works so hard to take care of families back home. A soldier will have a hard time focusing on their work if they're worried about whatever is going on back there - and that work might include things like patrolling, sighting an enemy, or, you know, identifying a bomb before your team triggers it. Seriously, being aware of when you're emotionally compromised is a skill well worth developing.)

We don't actually do well with cramming or powering through. I noticed this especially since learning how to code, because there are times where forcing myself to step away from a project was the best decision I could make. Seriously. Sometimes stepping away will allow me to come back with a fresh perspective, and finally work through whatever it was that was causing trouble. Or sleep - the number of times I figured out what I needed to do just as I was falling asleep is ridiculous. (Another little anecdote floating around on social media somewhere - a developer commented about how he's paid even when he's at the office goofing off playing a game, but not paid when he's in the shower and suddenly figures out how to code something-or-other for work).

People are fascinating. They can be completely selfish and irresponsible, and also extremely giving and sensible. They can be petty and spiteful, gracious and kind... and motivated for all sorts of reasons. (It's part of why I love humanity, even while I'm also sometimes frustrated and annoyed with it).

We're glorious as we are - and we're not robots. We can't be managed like robots, and nobody should even be trying to.

There are all sorts of managerial tools, and they have their time and place... KPIs (key performance indicators), expected rates per hour, quality and delivery goals - all sorts of data that managers use to tell how their business is doing.

But you have to know how and when to use them, and when to take them with a grain of salt. It's sort of like what my father said about predicting the weather (he has a master's in meteorology courtesy of the Air Force). He said you get all these measurements - temperature, barometric pressure, etc - and the computer makes a prediction...

And then you have to go outside. Look at the weather and see if it fits what's being predicted. Check the clouds. (I don't really know the names for which ones, but some types of clouds indicate certain types of weather, and if the computer is predicting one thing and the clouds don't look right for it, you may need to revise. Don't quote me on this, I'm not the one who actually studies these things.)

Businesses need data to track their performance, but a) we can only track the things we measure and b) people will game the system.

I suppose this is also like the relationship between qualitative and quantitative analysis, but I think this post has gone on long enough.

The main point was that we're people, not computers. And I see far too many managers who get so focused on data and metrics that they seem to forget that.

Sunday, December 4, 2022

If Nobody Followed A Tyrant...

Trump's statement about the Constitution (and the lack of response by Republicans to something said by a candidate for the 2024 presidential nomination) reminds me of something I thought back in 2020.

Trump called for states to 'stop the count' of votes during the 2020 election, and most people either never heard it in the first place - or shrugged and ignored him.

I've talked before about the complicated relationship between leaders and led, how it's almost like a magic trick. Trump sort of proves the point, in that he continues to say outrageous things... And mostly gets ignored. Or dismissed as political theater. 

On the one hand, it's disturbing to see people ignore the very dangerous things he says. Just like they ignore Jan 6. I used to think we had a shared understanding, of things like the social contract, democracy, the Declaration of Independence that said 'all men are created equal'... And apparently that's not true. There are powerful people who lack any understanding whatsoever of the lessons learned that made America the nation it is today.

On the other hand... If it weren't for the followers still (for whatever foolish reason) willing to support his insanity, it'd be funny. The way he's just... Ignored.

Has to drive that megalomaniac absolutely nuts.

It's a little like if Putin declared that everyone had to make a video with the letter Z, and everyone around him just sort of looked at each other uncomfortably and ignored him.

It's hard to be a tyrant if nobody follows your orders. 

Thursday, December 1, 2022

Penny Wise, Pound Foolish

https://nymag.com/intelligencer/2022/11/rail-strike-why-the-railroads-wont-give-in-on-paid-leave-psr-precision-scheduled-railroading.html