Organization Man (Losers, Sociopaths, Clueless)

I have been reading Venkatesh Rao’s series dissecting “The Office” at Ribbonfarm, and watching the actual TV series in a mad obsession.
Some opinions on the decision making behind the storyline writing. For example, going
from the branch-closing episode to the next, I would say there was an
experiment, and the next episode’s storyline was changed based on the TRP ratings.
But seriously, I was getting sick of the versions where Michael was preferring the
Stamford branch.
And as is natural with any psychology parallels/articles characterized or defined only in words, I found myself comparing my life with the articles. It keeps running through my head.
The more I think about work and what I should get done next, the more I realize I have already become the checked-out Loser. Darn it….

OK, he has a sub-division within the Loser list:
the staff-Loser and the line-Loser.
The definition being that the staff-Loser’s function is to be the priest of the hierarchy, while the line-Loser’s function is closer to revenue generation.
Now, because of the HIWTYL (“Heads I Win, Tails You Lose”) policy that all human beings learn to employ automatically in any social situation, you can expect the organization’s bureaucracy to be as heavily riddled with defensive policies as possible.

My instinctual shying away from Team Lead/Project Lead positions signals that I have a reflexive aversion towards becoming a staff-Loser. I don’t think they are useless; in the unfortunate real world we all live in, they are a necessity, if only to save businesses from opportunistic humans. It’s probably only in an Ayn-Randian ideal world that they can be avoided altogether.

Well, the next thing to do is unlearn the habits of checked-outness and learn
the habits of the Sociopath.

Rule 1: Bayesian decision making. Estimate the potential risks and potential rewards
of any career move (i.e., the 8-9 hours a day of what you do in the office).
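As a minimal sketch of what that estimation could look like (the two options and all the numbers here are my own made-up illustrations, not Rao’s):

```python
# Hedged sketch: comparing two hypothetical career moves by expected value.
# All probabilities and payoffs below are made-up illustrative numbers.

def expected_value(outcomes):
    """outcomes: list of (probability, payoff) pairs; returns sum of p * payoff."""
    return sum(p * payoff for p, payoff in outcomes)

# Option A: stay in the current role -- modest, fairly certain payoff.
stay = [(0.9, 10), (0.1, -5)]
# Option B: take the unpredictable project -- riskier, bigger upside.
move = [(0.5, 40), (0.5, -10)]

print(round(expected_value(stay), 2))  # 8.5
print(round(expected_value(move), 2))  # 15.0
```

The point is not the arithmetic but the habit: writing the outcomes down forces you to state probabilities you would otherwise leave implicit.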

Rule 2: To quote Venkatesh Rao from his Ribbonfarm post: “The risk-management work of an organization can be divided into two parts: the unpredictable part that is the responsibility of the line hierarchy, and the predictable, repetitive part that is the responsibility of the staff hierarchy.”
So this means: when in doubt about which option is good for your career, take the unpredictable one.

Rule 3: To quote him again: “Bureaucracies are structures designed to do certain things very efficiently and competently: those that are by default in the best interests of the Sociopaths.

They are also designed to do certain things incompetently: those expensive things that the organization is expected to do, but would cut into Sociopath profits if actually done right. And finally, they are designed to obstruct, delay and generally kill things that might hurt the interests of the Sociopaths.” Take the priority of your decisions not from the bureaucracy and the rules/system it makes, but from what you want to achieve/get done.
Oh, and while doing that, keep in mind that bureaucracies are hardly ever set in stone and do change; perhaps a thixotropic fluid is a good analogy. That is, they have high viscosity (read: resistance to change) under normal conditions, but are less viscous when stressed. I might even suggest this is the key behind Jack Welch’s successful turnaround of GE*. One could argue that he personally created the stress required to get the GE bureaucracy to adopt the changes required to earn profits.

Rule 4: Recognize your habit patterns in taking sides within social groups. Make sure you use ambiguity by deliberate choice (calculating its effect on your social status) rather than by reflex habits (read: heuristics) shaped by past experiences.

*– While I don’t claim to have a lot of knowledge about the history of GE or of Jack Welch’s term at it, I have read his book “Jack: Straight from the Gut” and think I have some sense of his approach/ideas in decision making.

Make lessons

I was messing around with the OpenCV library for C++ and its interface.
Very soon, after I began typing out code from web pages and compiling, I got tired of running the compilation command manually from the terminal.

Not to mention, since I had built OpenCV from source, I had to explicitly pass the include folder with -I.

Soon, I was wishing I could just run make and get the code to run.

Well, instead of that, I figured out a way to run make and just compile all the source files in the current folder into executables.

Here’s the Makefile I ended up using:

PROGRAMS_SRC=$(wildcard *.c)
TARGETS=$(patsubst %.c,%,$(PROGRAMS_SRC))
CPPFLAGS=-I/usr/local/include/opencv2

all:
	$(foreach var,$(PROGRAMS_SRC),g++ $(var) -o $(patsubst %.c,%,$(var)) $(CPPFLAGS); )

The two key things I learnt were patsubst and foreach. I had seen patsubst and wildcard before, while running through LCTHW.
As the name implies, the wildcard function matches a bunch of files, using shell-style glob patterns (not a full regex).
patsubst does pattern-based text substitution, using % as the placeholder.

foreach is simply a way to iterate over a list of values in a variable. I just end up running the g++ command for each of the files.

Note: I’m using the .c extension here, though I should really be naming the files .cpp and using that in this Makefile.
Note 2: This assumes all of your code is confined to one file (except for library imports). Otherwise this just won’t work.
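For what it’s worth, a more idiomatic sketch of the same Makefile (assuming GNU Make and the same OpenCV include path as above) would let pattern rules do the per-file work:

```make
# Sketch of a pattern-rule version: one target per source file.
PROGRAMS_SRC := $(wildcard *.c)
TARGETS      := $(patsubst %.c,%,$(PROGRAMS_SRC))
CPPFLAGS     := -I/usr/local/include/opencv2

all: $(TARGETS)

# Pattern rule: build each executable from its own .c file.
%: %.c
	g++ $(CPPFLAGS) $< -o $@

clean:
	rm -f $(TARGETS)
```

The foreach version recompiles everything on every run; with the pattern rule, each target depends on its own .c file, so make only rebuilds the sources that changed.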

Common startup mistakes

Dear Indian Startups with scaling problems:

1. An excuse is not a solution. (“Startups are chaotic” is an excuse, and does not solve any problems.)
2. Funding != Revenue
3. Survival !=> Growth (i.e., survival does not imply growth).

4. Employees draw the short straw in most startups (i.e., limited maximum benefits*, and quite an amount of loss that is more than monetary: time, energy, etc.), so try to find other ways to compensate.
5. The employment contract is like a postpaid plan: the employee comes in, gets things done over a month, and gets paid afterwards
(which is another downside to the employee, but an upside to the employer, of the standard employment-plus-payment structure).
Keep that in mind when you form policies to stop payment on an employee giving notice, etc., and perhaps more importantly before sending an employment offer; pick a temporary offer if you are not comfortable (aka, fucking think about all the implications of your choices when making policies that affect the whole company).
In other words, your internal policies are what make the tradeoffs between customers vs employees, sales vs engineering, etc. Make sure you consider those tradeoffs carefully and that they align with your goals.

6. If you want to manage your programmers like machine-shop workers (i.e., blindly borrowing management principles from the assembly line), then don’t be surprised when they act like them.
7. If you say “I don’t think X” when you really mean “I can’t imagine why X”, you are miscommunicating (akin to perjury in legal parlance). The X here can be a decision, heuristic, idea, etc.

8. There are certain assumptions a software engineer makes. Like:
a. The office space is quiet enough to hear oneself think.
b. The colleagues’ understanding and knowledge are within a reasonable distance (above or below) of one’s own.

9. Shifting the burden of proof/blame is not the same as an open discussion/culture.
* — I’m assuming no stock options.

Honesty as a value proposition

Honesty as a value proposition. — VGR writes here about honesty as a meaningful value addition
(on its own; i.e., considering honesty as a separate dimension in the hiring decision is meaningful) to the modern corporate environment.
Of course, most of his suggestions are aimed at executive management positions.

Anyway, my experience suggests that treating honesty as a one-dimensional value is not enough.
Honesty under uncertainty and honesty under certainty are two very different beasts.

I believe it’s easier to be honest under certainty. I think when there’s uncertainty,
people tend to form stories (scripts, as VGR would call them), and everything they say, do, report, etc. is built around them.

There’s a question about the intentionality of this specific type of dishonesty, but I’ll leave that to the philosophers.
I’ll just observe that the forms of this dishonesty map very well onto the forms of deception in another VGR work, his “Be Slightly Evil” mailing list.
Now, of itself it doesn’t exactly mean lying, but it doesn’t exactly mean you can take what is reported to you at face value either.

Either way, the point I was making is that honesty about probabilities and their expected values is very, very hard to ascertain in interviews.
But it has an inherent value that’s near impossible to quantify or enumerate.

Honesty under uncertainty is perhaps the hardest to practice. When you don’t have a revenue stream that meets your expenses,
but are hoping to meet the expenses in the future, you have a strong incentive to believe you have a very good chance of achieving that.
Not to mention the set of cognitive biases human beings are so very vulnerable to.

None of us are perfect Bayesian agents.
The biggest challenge in startups I have seen so far is the communication of prior probabilities and the conditional probabilities that depend on them.
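To make the prior/conditional-probability point concrete, here is a minimal sketch of the kind of Bayesian update an honest-under-uncertainty founder would have to communicate (the hypothesis, the evidence, and all the numbers are invented for illustration):

```python
# Hedged sketch: Bayes' rule with made-up numbers.
# Hypothesis H: "we reach break-even revenue within a year".
# Evidence E:  "we closed a big pilot customer this quarter".

def bayes_update(prior, p_e_given_h, p_e_given_not_h):
    """Posterior P(H|E) from the prior P(H) and the two likelihoods."""
    p_e = p_e_given_h * prior + p_e_given_not_h * (1 - prior)
    return p_e_given_h * prior / p_e

prior = 0.2  # an honest prior: most startups don't make it
posterior = bayes_update(prior, p_e_given_h=0.6, p_e_given_not_h=0.2)
print(round(posterior, 3))  # 0.429
```

Even a single good quarter only moves an honest prior of 0.2 up to roughly 0.43 here, which is exactly the kind of number a positively-biased story would refuse to report.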

Other areas where this honesty about uncertainty comes into play are investment advising, people management, HR management, academic research, etc.

In my experience, these are the areas where a whole lot of signalling and power plays go on.

In investment, NNT (Nassim Nicholas Taleb) seems to have started some work in exposing some of the dishonesty.
Since I have no experience working in that field, I won’t comment on what effect it has had,
but I will point out that it was one of the areas which used to have a flashy after-work life, and a very heavy, formal signalling at-work life.

Another, perhaps, side effect of this honesty about uncertainty is under-confidence. The converse definitely seems true:
lack of honesty under uncertainty seems to have a high correlation with over-confidence.

The trouble with looking for honesty and treating it as an economic proposition is that honesty between two agents rarely works only one way…
And very typically, in business, uncertainty is treated with dishonesty (a strong positive bias, driven by positive-signalling needs), and people who have made honesty a habit can’t stand it.
In fact, some habituation or set of causes leads some people to demand and expect an honesty that goes to the level of trying to assign quantified probability values to personal uncertainties and using them to compare decisions for inconsistency.

To use the Gervais Principle terms: Losers avoid/deny uncertainties. They prefer to turn a blind eye to uncertainties, to escape from the uncertain parts of reality.

The Clueless run bravely against uncertainty. In its presence, they activate their super-hero fantasies and try to charge at it and fight.

The Sociopaths realise they have limits on what they can do against uncertainty, but figure out how they can use it, and the predictable behaviours of the people around it, to get what they want.

Epiphany moments

A set of epiphanies

  1. Thinking about functions while writing an application is as likely to lead to overuse of functions as it is to overuse of objects. It’s all in the thinking habit more than in the paradigm (OOP vs functional programming).

  2. Expectation management is just a very crude buzzword for the practice of quoting/predicting/setting schedules, i.e., the target time when others should pay attention to check/verify/test the result (I was tempted to say “when others should expect”, but yay, managed to avoid it).

  3. I definitely love the attention of other people. Though it’s clear it’s not as simple as just liking it all the time. It seems there’s a temporal phase component (a sine wave?? Nah, I’m being too wishful; probably more complex, but a good heuristic to start with). I wonder if there’s a number component too, i.e., the number of people who are paying me attention. Past experience suggests that a big number, above 50-100, is unnerving. I’m convinced I’m comfortable at a number closer to 50. Though I think that measure has another confounding factor: strangers vs people I already know or have spent considerable time with.

  4. When I hear about people not having official documents like a DL, passport, etc., my bias or sense of respect for them seems to change negatively, though I’m uncomfortable with calling it a sense of respect.

  5. Reading this post from Rushabh Mehta about going or not going into sales (http://erpnext.com/to-be-or-not.html) made me realise a lot of things. 1. I’m not a product guy, because I can’t maintain that sustained focus irrespective of the rate/speed of feedback. 2. I’m impatient and want quick/instant feedback. Maybe I can be the product guy for some types of apps where there’s a very short feedback cycle, but for the most part I’m just not that guy. 3. Not to mention the limitations of my own rationality. I’m now very convinced of a career change.

  6. My auditory attention is untrained/under-trained. That’s basically why I have so much trouble with verbal communication clarity. I have to put so much deliberate attentional capacity into speech and listening that I don’t have any left to see the gaps in what I hear.

  7. With the request to change the REST API back to the standard HTTP query-string parameters, now I understand. It’s a waste of effort. But it’s OK, because I’m paid a lot less than that .NET developer who has to learn JSON to consume my API. Darn it. That’s the end.

  8. I am beginning to realize, that the time to

  9. Looking for ahava* via the scientific method (NHST: https://en.wikipedia.org/wiki/Null_hypothesis) might be the craziest thing you ever do, and is likely to render you peaceless and lonely for the rest of your lifetime…. But it does lead to an interesting life. *– Hebrew for love

  10. A good heart may be a necessary condition for a heroic/ethical/moral life, but it is definitely not sufficient. In fact, paraphrasing Million Dollar Baby: show me a hero who’s all heart, and I’ll show you a shooting star that’s gonna get 15 minutes of fame and then fade into nothingness.

  11. Life’s 99

  12. Before asking a question, think about what you would do to find out the answer. In most cases, you’ll find that you don’t even care about the answer, and asking the question is rather moot/pointless.

  13. The more specialized your role/expertise/area becomes, the more your job becomes that of convincing a group of people to follow something or some way. The more I think or write about s/w architecture, the more I think it is about convincing people rather than getting something done ASAP.

  14. The beauty of dynamic typing is also its bane. i.e., not worrying about type conversions can easily blow up into a badly designed code base, especially when you’re part-timing on a project.

  15. Also just realized a side effect of vim addiction: copy and paste without understanding the functionality.

  16. Writing and building a s/w application from scratch is exactly like writing/proving a rigorous theorem. That’s why I got into it. EWD was right. But it is also a little masochistic, and in the words of Hofstadter, yeah, the rest can be left to the rigorous mathematicians. That’s why I want to get out of creating software. I didn’t realize it before because I never really created software at IBM. It is this epiphany that 3 years of working at IBM has cost me, and it is this epiphany I find the costliest of all the prices I have paid in making that decision.

  17. Beginning to get annoyed with vim: it deals with text and does it great, but it’s crude, like using a butcher’s knife at a neurosurgery table. I really need to start using some other IDE or consider emacs.

  18. Realized why function composition is such a good idea and very useful. In big projects with humongous code bases, chaining multiple functions is a necessity. Using a language that doesn’t design for it is painful for the same reason. And it’s for the same reason Java and .NET shops are so big and hierarchical. And more importantly, slow; or rather, programmers who work in those languages tend to become lazy and have a poor understanding of how the computer works.

  19. The browser is not a pure functional object. It provides a lot of semantic ambiguities that are grounded in its being designed for end users, most of whom don’t know math.

  20. Very clearly, I have an overactive ACC / am over-dependent on it. It needs to be actively suppressing something pretty much most of the time. I am beginning to understand the source of the bipolar lisp programmer internet meme and its appeal to so many other programmers.

  21. So does the part that is responsible for simulation (of any kind) of real-life experiences…

  22. Error handling in Python is a pain if you set the value to None and check it everywhere, and a pain if you use try/except. In the first case, you have to go catch the error at every function that calls a given function. In the second, if there’s a failure, you don’t really get the stack trace of where it occurred, only of where the exception was raised/caught. Damn… I can almost see why the Either approach in Haskell is better for programmers. My first thought is that it works because of the type system, but I would like to know more details than that.

  23. Another way to put Venkatesh Rao’s HIWTYL strategy is that people automatically assign higher mal-intention to errors of commission than to errors of omission.

  24. The more states involved in your application, the more code it takes to move them around. The more states in your application’s DB design, the more configurable it is. That’s the core of the trade-off.

  25. Creating separate Python files to separate two different sets of functions at the start of an application is a crazy idea (because it’s one of those naming issues that’s neither here nor there, and is only likely to confuse you in delivering the completed functionality). You’re better off writing out all the functions required for the application’s functionality and then splitting them up and renaming the files as necessary.

  26. Just when I was beginning to berate Python’s philosophy of readability for not providing the ability to subtract two lists, I realized I can use sets for the same purpose, and there is a possible semantic confusion avoided by that policy. Hmm… I think I understand the benevolent dictator’s attitude.

  27. I’m now tired of the multiple layers I have created for the VIP migration project. It doesn’t seem to make any sense to have so many layers.

  28. The problem with imperative programs? One misplaced return statement can waste half a day of a programmer’s effort, because it doesn’t throw any meaningful error or raise an exception; it just returns.

  29. Copied music files to that dmguest user’s folder and realized that I would like a progress bar like the one on Windows. Then realized it’s not on Linux, because coding up something like that would involve inaccurate approximations and some assumption-making guesses. Not sure why it’s not available on the command line, though; the Thunar file manager has it, and I’m sure other file managers have it too.

  30. Also, I’m beginning to realize that I am probably not someone who can program (in the sense of John Cook’s blog: someone who can write large programs with a large probability of being correct), but more of a computer scientist in that context.

  31. Also, perhaps not for the first time, I realize I have been pathetic at quantitative analysis in my life’s past decisions, though I have been fairly good (read: slightly above average/median) at qualitative analysis. I remember thinking I am good at managing globally but horrible at managing locally. Remember thinking I’m good with global targeting? Remember, 9-10 years ago, deciding I didn’t want a career in my base degree, mechanical engineering? Remember regretting that even 3-4 years ago; hell, even now there’s a little wonder about how life would have turned out in that case. Anyway, the truth is it was a good decision in terms of how the market has panned out, i.e., the profit margins growing lower and lower on the manufacturing front.

  32. I want a programming language/framework/toolset that lets me ignore the hacked-up technological mess that is the web, lets me focus on the problems and the math solutions to them, but doesn’t restrict my problem space. But guess what, there isn’t a reasonable compromise. Yesod + Haskell seems promising so far, but I’m not sure how it’s going to stand the changes over the next decade or so.

  33. ORM is a big fuck-up because it is a leaky abstraction. But more importantly, its leaks are not evident until you’ve invested long enough in it to have wasted the time. In fact, that’s pretty much the problem with learning new abstractions: you just can’t be careful enough about which ones you learn.

  34. You know you’ve fucked up in organizing the code’s functions across files when you want to find a function definition and are not sure where it is. Worse, when you start fearing having to search for a function name and not finding it in the current file.

  35. ORM is actually a double jeopardy, because it lets you pretend your tables are objects, which they aren’t, and then forces you to use object-oriented syntax for dealing with them. Duh….

  36. In fact, the other jeopardy of object-oriented programming is that it forces you to think in terms of objects/data structures that can do stuff. So when building an application, you tend to think of objects/data structures as the core and functions as support structures. It’s the reverse: most of the time you need to figure out the functionality as you go, which means quick turn-around time matters.

  37. Just had a simple trial run of Geany. It’s cool; I love the symbols bar on the side. Seems useful. Pros: the symbol table in the sidebar. Cons: it opens a new terminal window on running; I would rather use something plugged into the Geany window itself.

  38. I name my functions trying to match what they do, and find it a useful indicator that I have to break a function down into smaller ones: too-long names are signs that I’m cramming too many things together.

  39. There’s an enchantment in listening to music in a language that you don’t understand. It’s a taste of mystery and/or mysticism. Listening to Japanese music now.

  40. I can use the typeracer game as a test of how coordinated my visual attention and typing skills are, and whether that leaves any attention free for logical reasoning. That would be a good test to see if I should do server admin stuff, or development, or just go home, or just type out journal stuff.

  41. One more epiphany today, when Raj asked whether I know C. Of course I learnt it; I never used it professionally, but I have been playing around on and off with open-source C projects, and have found the Linux C development chain painful. But I have a bad confidence issue that heavily undercuts my confidence level in C, partly left over from living with my brother and his ideas on how a C user should be.

  42. Working only with Python for a long time is likely to make you pick up bad thinking habits, like preferring cute, short, readable code over clear, well-commented, efficient code. I’m getting the hell out of Python for some time now.
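Epiphany 22 above, on None-checking versus exceptions versus Haskell’s Either, can be sketched in Python like this (the parse_age example and its three variants are my own illustration):

```python
# Hedged sketch of the three error-handling styles from epiphany 22,
# using a made-up parse_age example.

# Style 1: return None -- every caller has to remember to check.
def parse_age_none(s):
    try:
        n = int(s)
    except ValueError:
        return None
    return n if n >= 0 else None

# Style 2: raise -- the traceback shows where the exception was raised,
# which may be far from the call that passed in the bad data.
def parse_age_raise(s):
    n = int(s)  # raises ValueError on non-numeric input
    if n < 0:
        raise ValueError("age must be non-negative")
    return n

# Style 3: Either-like -- return (ok, value_or_error) so the caller
# must handle both cases explicitly, like Haskell's Either type.
def parse_age_either(s):
    try:
        n = int(s)
    except ValueError as e:
        return (False, str(e))
    return (True, n) if n >= 0 else (False, "age must be non-negative")

print(parse_age_none("x"))     # None
print(parse_age_either("42"))  # (True, 42)
```

In Haskell, the type checker forces every caller of an Either-returning function to handle both cases; in Python, style 3 only works by convention, which is probably the “type system” intuition from the epiphany.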

Sense of agency vs Hierarchical control

Levels of Hierarchy:
Two levels are studied in this experiment.
1. Perceptual-motor level
2. Goal level

* – Sense of Agency
**– Expected from previous trials and practice trials.

Inferences from previous Research:
Measures of agency:
All of these should be validated/verified against measure theory principles.
Two types:
Implicit — intentional binding is the most associated (aka correlated??) with measures like efference, sensory feedback, causal feedback and intentionality (??).
Explicit — explicit rating of authorship

1. Intentional binding —
a. Self-reported temporal closeness between an action (in this case, self-generated: shooting at some target)
and its feedback (in this case, a circle flashed at the aimed location).
b. It has been shown that movement induced by motor-cortex stimulation doesn’t affect intentional binding.
(Might it be affected in long-term meditators, perhaps??)
c. Stronger when the subject believes that he has control over the environment.
(People managers/executives etc., wonder about the implications? :-))

Experiment Design:
Base Paradigm:
Subject task (explicit goal):
1. Aim and shoot at a target in a noisy set of visual stimuli.
2.
Variables:
1. Amount of noise
2. Interval between the trigger press and target stimulus appearance.
3. Estimation of the interval between trigger press and target appearance.
4. Self-reported control
5.

2. SOA* — manipulating/changing the time lapse between the subject’s action and the expected** response

Basic hypothesis/argument:
The concept of control in the perceptual-motor (action) event loop provides a basic framework to understand the explicit and implicit sense of agency.

Papers to read:
1.Event-control framework for sense of agency (Jordan, 2003;)

Statistical Results:
I don’t qualify to judge whether their choice of ANOVA is right or not. And similarly, I don’t try to make sense of the statistical results and interpret them, as I never learnt beyond first-order statistics. I’m working on it though, so later.
Conclusion:
Overall, I learnt a lot from the background, theory, and summaries of some of the references.
They were all new and interesting to me. But I was left wishing I had picked a paper which had concrete (negative or positive) results on a specific (tight?) hypothesis.

Deep C — activation frames/ stack frames

OK, this ‘Deep C’ slide talk suggests (slide 136) that activation frames and the execution stack are two very different concepts in the executable created by compiling and linking a C program.

I was curious and looked up activation frames in C, which led me here. Of course, before I began this journey, I had only heard of the stack memory used by a program, and had an understanding that it is one part of the initial memory allocated to the process by the OS that is used in a stack fashion (i.e., LIFO).

The University of Calgary link mixes the terms activation, stack, frame, and execution rather freely. Something’s up. Though first, let me try out the example in that Univ. of Calgary link and see what happens.

Here’s the code, its compilation, and a run of it:

#include <stdio.h>

void print_facts(int num1, int num2);
int max_of_two(int j, int k);
double avg_of_two(int c, int d);

int main(void)
{
  int i;
  int j;
  /* point  1 */
  i = -8;
  j = 7;
  /* point  2 */
  print_facts(i, j);
  /* point 10 */
  return 0;
}

void print_facts(int num1, int num2)
{
  int larger;
  double the_avg;
  /* point  3 */
  larger = max_of_two(num1, num2);
  /* point  6 */
  the_avg = avg_of_two(num1, num2);
  /* point  9 */
  printf("For the two integers %d and %d,\n", num1, num2);
  printf("the larger is %d and the average is %g.\n",
         larger, the_avg);
}

int max_of_two(int j, int k)
{
  /* point  4 */
  if (j < k)
    j = k;
  /* point  5 */
  return j;
}

double avg_of_two(int c, int d)
{
  double sum;
  /* point  7 */
  sum = c + d;
  /* point  8 */
  return (c + d) / 2.0;
}

Compile(make), and run executable:

anand@anand-usb-boot:C [master] $ make activation_record_demo
gcc --std=c99 -g -pedantic -Wall -lm    activation_record_demo.c   -o activation_record_demo
activation_record_demo.c: In function ‘avg_of_two’:
activation_record_demo.c:50:12: warning: variable ‘sum’ set but not used [-Wunused-but-set-variable]
anand@anand-usb-boot:C [master] $ ./activation_record_demo 
For the tow integers -8 and 7, 
the larger is 7 and the average is -0.5.

Now comes the fun part: We need to be able to examine the stack, during the execution of the program.
GDB to the rescue.

Here’s the gdb session i went through:

anand@anand-usb-boot:C [master] $ gdb64 ./activation_record_demo
GNU gdb (GDB) 7.5.91.20130417-cvs-ubuntu
Copyright (C) 2013 Free Software Foundation, Inc.
License GPLv3+: GNU GPL version 3 or later 
This is free software: you are free to change and redistribute it.
There is NO WARRANTY, to the extent permitted by law.  Type "show copying"
and "show warranty" for details.
This GDB was configured as "x86_64-linux-gnu".
For bug reporting instructions, please see:
...
Reading symbols from /home/anand/workspace/github_stuff_public/Miscellaneous/C/activation_record_demo...done.
(gdb) b 11
Breakpoint 1 at 0x400534: file activation_record_demo.c, line 11.
(gdb) run
Starting program: /home/anand/workspace/github_stuff_public/Miscellaneous/C/activation_record_demo 

Breakpoint 1, main () at activation_record_demo.c:12
12	    i = -8;
(gdb) frame
#0  main () at activation_record_demo.c:12
12	    i = -8;
(gdb) n
13	    j = 7;
(gdb) n
16	    print_facts(i,j);
(gdb) frame
#0  main () at activation_record_demo.c:16
16	    print_facts(i,j);
(gdb) s
print_facts (num1=-8, num2=7) at activation_record_demo.c:28
28	    larger = max_of_two(num1,num2);
(gdb) frame
#0  print_facts (num1=-8, num2=7) at activation_record_demo.c:28
28	    larger = max_of_two(num1,num2);
(gdb) s
max_of_two (j=-8, k=7) at activation_record_demo.c:42
42	    if (j < k)
(gdb) frame
#0  max_of_two (j=-8, k=7) at activation_record_demo.c:42
42	    if (j < k)
(gdb) frame 0
#0  max_of_two (j=-8, k=7) at activation_record_demo.c:42
42	    if (j < k)
(gdb) frame 
#0  max_of_two (j=-8, k=7) at activation_record_demo.c:42
42	    if (j < k)
(gdb) down
Bottom (innermost) frame selected; you cannot go down.
(gdb) s
43	        j = k;
(gdb) s
45	    return j;
(gdb) s
46	}
(gdb) s
print_facts (num1=-8, num2=7) at activation_record_demo.c:31
31	    the_avg = avg_of_two(num1,num2);
(gdb) s
avg_of_two (c=-8, d=7) at activation_record_demo.c:52
52	    sum = c + d;
(gdb) s
54	    return (c+d) / 2.0;
(gdb) frame
#0  avg_of_two (c=-8, d=7) at activation_record_demo.c:54
54	    return (c+d) / 2.0;
(gdb) s
55	}
(gdb) s
print_facts (num1=-8, num2=7) at activation_record_demo.c:34
34	    printf("For the tow integers %d and %d, n",num1,num2);
(gdb) n
For the tow integers -8 and 7, 
35	    printf("the larger is %d and the average is %g. n",larger,the_avg);
(gdb) n
the larger is 7 and the average is -0.5. 
37	}
(gdb) c
Continuing.
[Inferior 1 (process 24420) exited normally]
(gdb) run
Starting program: /home/anand/workspace/github_stuff_public/Miscellaneous/C/activation_record_demo 
warning: no loadable sections found in added symbol-file system-supplied DSO at 0x7ffff7ffa000

Breakpoint 1, main () at activation_record_demo.c:12
12	    i = -8;
(gdb) s
13	    j = 7;
(gdb) s
16	    print_facts(i,j);
(gdb) frame
#0  main () at activation_record_demo.c:16
16	    print_facts(i,j);
(gdb) s
print_facts (num1=-8, num2=7) at activation_record_demo.c:28
28	    larger = max_of_two(num1,num2);
(gdb) frame
#0  print_facts (num1=-8, num2=7) at activation_record_demo.c:28
28	    larger = max_of_two(num1,num2);
(gdb) info frame
Stack level 0, frame at 0x7fffffffddb0:
 rip = 0x400566 in print_facts (activation_record_demo.c:28); 
    saved rip 0x400551
 called by frame at 0x7fffffffddd0
 source language c.
 Arglist at 0x7fffffffdda0, args: num1=-8, num2=7
 Locals at 0x7fffffffdda0, Previous frame's sp is 0x7fffffffddb0
 Saved registers:
  rbp at 0x7fffffffdda0, rip at 0x7fffffffdda8
(gdb) s
max_of_two (j=-8, k=7) at activation_record_demo.c:42
42	    if (j < k)
(gdb) s
43	        j = k;
(gdb) s
45	    return j;
(gdb) info frame
Stack level 0, frame at 0x7fffffffdd80:
 rip = 0x4005e6 in max_of_two (activation_record_demo.c:45); 
    saved rip 0x400575
 called by frame at 0x7fffffffddb0
 source language c.
 Arglist at 0x7fffffffdd70, args: j=7, k=7
 Locals at 0x7fffffffdd70, Previous frame's sp is 0x7fffffffdd80
 Saved registers:
  rbp at 0x7fffffffdd70, rip at 0x7fffffffdd78
(gdb) frame
#0  max_of_two (j=7, k=7) at activation_record_demo.c:45
45	    return j;
(gdb) up
#1  0x0000000000400575 in print_facts (num1=-8, num2=7)
    at activation_record_demo.c:28
28	    larger = max_of_two(num1,num2);
(gdb) frame
#1  0x0000000000400575 in print_facts (num1=-8, num2=7)
    at activation_record_demo.c:28
28	    larger = max_of_two(num1,num2);
(gdb) info frame
Stack level 1, frame at 0x7fffffffddb0:
 rip = 0x400575 in print_facts (activation_record_demo.c:28); 
    saved rip 0x400551
 called by frame at 0x7fffffffddd0, caller of frame at 0x7fffffffdd80
 source language c.
 Arglist at 0x7fffffffdda0, args: num1=-8, num2=7
 Locals at 0x7fffffffdda0, Previous frame's sp is 0x7fffffffddb0
 Saved registers:
  rbp at 0x7fffffffdda0, rip at 0x7fffffffdda8
(gdb) up
#2  0x0000000000400551 in main () at activation_record_demo.c:16
16	    print_facts(i,j);
(gdb) info frame
Stack level 2, frame at 0x7fffffffddd0:
 rip = 0x400551 in main (activation_record_demo.c:16); 
    saved rip 0x7ffff7a33ea5
 caller of frame at 0x7fffffffddb0
 source language c.
 Arglist at 0x7fffffffddc0, args: 
 Locals at 0x7fffffffddc0, Previous frame's sp is 0x7fffffffddd0
 Saved registers:
  rbp at 0x7fffffffddc0, rip at 0x7fffffffddc8
(gdb) down
#1  0x0000000000400575 in print_facts (num1=-8, num2=7)
    at activation_record_demo.c:28
28	    larger = max_of_two(num1,num2);
(gdb) down
#0  max_of_two (j=7, k=7) at activation_record_demo.c:45
45	    return j;
(gdb) down
Bottom (innermost) frame selected; you cannot go down.
(gdb) s
46	}
(gdb) s
print_facts (num1=-8, num2=7) at activation_record_demo.c:31
31	    the_avg = avg_of_two(num1,num2);
(gdb) n
34	    printf("For the two integers %d and %d, \n",num1,num2);
(gdb) n
For the two integers -8 and 7, 
35	    printf("the larger is %d and the average is %g. \n",larger,the_avg);
(gdb) n
the larger is 7 and the average is -0.5. 
37	}
(gdb) c
Continuing.

Note the output of the frame commands.
Each prints the frame number, then (for outer frames) the saved return address, the function with its arguments, and the source file and line number.
The next line prints the actual source code at that line.

Interestingly, look at the address differences as I move around the stack using up/down.

For ex:

(gdb) info frame
Stack level 0, frame at 0x7fffffffdd80:
 rip = 0x4005e6 in max_of_two (activation_record_demo.c:45); 
    saved rip 0x400575
 called by frame at 0x7fffffffddb0
 source language c.
 Arglist at 0x7fffffffdd70, args: j=7, k=7
 Locals at 0x7fffffffdd70, Previous frame's sp is 0x7fffffffdd80
 Saved registers:
  rbp at 0x7fffffffdd70, rip at 0x7fffffffdd78
(gdb) frame
#0  max_of_two (j=7, k=7) at activation_record_demo.c:45
45	    return j;
(gdb) up
#1  0x0000000000400575 in print_facts (num1=-8, num2=7)
    at activation_record_demo.c:28
28	    larger = max_of_two(num1,num2);
(gdb) frame
#1  0x0000000000400575 in print_facts (num1=-8, num2=7)
    at activation_record_demo.c:28
28	    larger = max_of_two(num1,num2);

Here, I move up the stack. The frame I started in was at the stack address

0x7fffffffdd80

while frame #1 is printed with the address

0x0000000000400575

At first look it seems bizarre, till you realize the second number is not a stack address at all: it is the saved return address (the saved rip shown in the info frame output), i.e. the code address inside print_facts where execution resumes once max_of_two returns.

Deep C++ - virtual tables

Moving further along in that ridiculously long (unnecessarily many slides) Deep C pdf, we come across Deep C++.


#include <iostream>

struct X
{
    int a;
    char b;
    int c;
};

int main(void)
{
    std::cout << sizeof(X) << std::endl;
    return 1;  // note: this is why gdb later reports "exited with code 01"
}

Now that code is not much different from C (except for std::cout instead of printf), so it should print the same size as a padded struct in C (see the previous post).
So it does:

 g++ cpp_vtable.cpp  -o cpp_vtable
anand@anand-usb-boot:C [master] $ gdb64 cpp_vtable
Reading symbols from /home/anand/workspace/github_stuff_public/Miscellaneous/C/cpp_vtable...(no debugging symbols found)...done.

(gdb) run
Starting program: /home/anand/workspace/github_stuff_public/Miscellaneous/C/cpp_vtable 
warning: no loadable sections found in added symbol-file system-supplied DSO at 0x7ffff7ffa000
12

Now what happens if we add a member function to that struct? Let's see:
I'm adding this line at the end of the struct.

void set_value(int v) {a = v; }

Well, it still prints the same 12. What does this mean? According to the pdf, non-virtual member functions in C++ are just syntactic sugar: each one is equivalent to a C function that takes an extra first argument, a pointer to the struct it is a member of.

That’s why we get this output:

 g++ cpp_vtable.cpp  -o cpp_vtable
anand@anand-usb-boot:C [master] $ gdb64 cpp_vtable
Reading symbols from /home/anand/workspace/github_stuff_public/Miscellaneous/C/cpp_vtable...(no debugging symbols found)...done.
(gdb) run
Starting program: /home/anand/workspace/github_stuff_public/Miscellaneous/C/cpp_vtable 
warning: no loadable sections found in added symbol-file system-supplied DSO at 0x7ffff7ffa000
12

OK, that no loadable sections found warning from gdb is weird, but otherwise the output is the same 12 as expected.

Now let’s go ahead and make it a virtual function instead.


virtual void set_value(int v) {a = v; }

And here’s the result of compiling and running it


anand@anand-usb-boot:C [master] $ g++ cpp_vtable.cpp  -o cpp_vtable
anand@anand-usb-boot:C [master] $ gdb64 cpp_vtable

(gdb) run
Starting program: /home/anand/workspace/github_stuff_public/Miscellaneous/C/cpp_vtable 
warning: no loadable sections found in added symbol-file system-supplied DSO at 0x7ffff7ffa000
24
[Inferior 1 (process 8290) exited with code 01]
(gdb) run
Starting program: /home/anand/workspace/github_stuff_public/Miscellaneous/C/cpp_vtable 
warning: no loadable sections found in added symbol-file system-supplied DSO at 0x7ffff7ffa000

Breakpoint 1, 0x0000000000400830 in main ()

Hmm, we still get the same warning, but the size has doubled, from 12 to 24. That must be the virtual table pointer: an 8-byte vptr on top of the original 12-byte payload is 20 bytes, rounded up to 24 because the struct's alignment is now 8. But how do we know it isn't allocating memory in the object separately for each virtual function? Time to add more functions to the source code.

So here's how the new struct looks:

struct X
{
    int a;
    char b;
    int c;
    virtual void set_value(int v) { a = v; }
    virtual int get_value() { return a; }
    virtual void increase_value() { a++; }
};

And here’s the output of running the program.

anand@anand-usb-boot:C [master] $ g++ cpp_vtable.cpp  -o cpp_vtable
anand@anand-usb-boot:C [master] $ gdb64 cpp_vtable
GNU gdb (GDB) 7.5.91.20130417-cvs-ubuntu

(gdb) run
Starting program: /home/anand/workspace/github_stuff_public/Miscellaneous/C/cpp_vtable 
warning: no loadable sections found in added symbol-file system-supplied DSO at 0x7ffff7ffa000
24
[Inferior 1 (process 8353) exited with code 01]
(gdb) 

You can see that the size is the same as the previous code, 24. So it confirms: if there's at least one virtual function, a single pointer to a virtual table is added to the structure, no matter how many virtual functions there are. In fact, this same virtual table is what makes overriding work in inheriting classes: a derived class that overrides a virtual function gets the overriding function's address in the corresponding vtable slot.

By inductive reasoning the next step is to simply add another struct to the code and see what happens.

Here's what happens if I copy-paste the same struct as struct Y, keeping only the get_value virtual function.

anand@anand-usb-boot:C [master] $ g++ cpp_vtable.cpp  -o cpp_vtable
anand@anand-usb-boot:C [master] $ gdb64 cpp_vtable
(gdb) run
Starting program: /home/anand/workspace/github_stuff_public/Miscellaneous/C/cpp_vtable 
warning: no loadable sections found in added symbol-file system-supplied DSO at 0x7ffff7ffa000
24
24
[Inferior 1 (process 8674) exited with code 01]
(gdb) quit

Just as expected it's the same size, 24, printed once per struct as we coded. :) Each class with virtual functions gets its own virtual table; the table itself lives outside the object, so only the single per-instance vptr shows up in sizeof, however many virtual functions there are.

The next test would be to see whether derived classes share the base class's vtable pointer slot or add one of their own, but I'll save that for some other time.

Deep C - struct packing

Once again, from Deep C. The size of structs.

The basic idea is to test the knowledge that compilers sometimes optimize how a structure is laid out in memory. According to the pdf, this is because most hardware architectures are optimized for aligned, word-sized access, so compilers generally pad structures so that each member sits at its natural alignment. Now this may have been true some time back, and is perhaps still true in a statistical-majority sense, but with the profusion of mobile computing devices I suspect newer architectures might allow more fine-grained (sub-word) access. Anyway, that's marked for future research. Besides, given how much memory has shrunk, that's probably not the case for smartphones and tablets; they probably still use word-optimized architectures. I suppose the older (voice-calls-only) phones had architectures where this was false. I also think that on other microcontroller-based devices, RSA security code generators, active-noise-cancelling chips in headphones/earphones, etc. (i.e., wherever memory is still a constraint), this would make a difference, and their architectures might not make sub-word memory accesses inefficient.

For now, I’ll just go on with the code and a demo here.


#include <stdio.h>

struct X { int a; char b; int c; };

int main(void)
{
    printf("%zu\n", sizeof(int));
    printf("%zu\n", sizeof(char));
    printf("%zu\n", sizeof(struct X));
    return 0;
}

And here’s the output with the minimal set of compilation flags.

anand@anand-usb-boot:C [master] $ gcc struct_padding.c -o struct_padding
anand@anand-usb-boot:C [master] $ ./struct_padding 
4
1
12

Now, I'm running a 64-bit machine. (I initially guessed 32-bit mode from the 12, but this struct only contains ints and a char, so it would print 12 either way; it doesn't matter for our purposes.)
As expected, the struct is padded to 12 bytes (4 + 1 + 3 padding + 4), instead of the sum of its parts, which is 4 + 1 + 4 = 9.

Now let’s add the -fpack-struct option to the gcc command.

anand@anand-usb-boot:C [master] $ gcc struct_padding.c -o struct_padding -fpack-struct
anand@anand-usb-boot:C [master] $ ./struct_padding 
4
1
9

Voila, there it is: the size our mental model says it should be.
Let's go add a char pointer to the struct.

#include <stdio.h>

struct X { int a; char b; int c; char *d; };

int main(void)
{
    printf("%zu\n", sizeof(int));
    printf("%zu\n", sizeof(char));
    printf("%zu\n", sizeof(char *));
    printf("%zu\n", sizeof(struct X));
    return 0;
}


Here's the output without -fpack-struct:

anand@anand-usb-boot:C [master] $ gcc struct_padding.c -o struct_padding
anand@anand-usb-boot:C [master] $ ./struct_padding 
4
1
8
24

Here’s the output with -fpack-struct:

gcc struct_padding.c -o struct_padding -fpack-struct
anand@anand-usb-boot:C [master] $ ./struct_padding 
4
1
8
17

That 17 with -fpack-struct is expected: 4 + 1 + 4 + 8 = 17. Note that the pointer is 8 bytes, and without -fpack-struct the size pads up from 20 (the previous 12 plus the 8-byte pointer) to 24: the layout becomes a at 0, b at 4, 3 bytes of padding, c at 8, 4 bytes of padding, then d at 16, so the pointer lands on an 8-byte boundary.

Another good link I came across is this

The Mountain Where Rain Never Falls

Math with Bad Drawings

The sixth in a series of seven fables/lessons/meditations on probability.

Another day of hiking brought the teacher and the student to an empty hut by a mountain stream. “We will rest here a while, and wash our clothes,” the teacher said.

When they had laid their clean clothes on sunny rocks to dry, the student pointed to the clouds gathering in the valley below. “Looks like rain. Should we be worried?”

“The rains have reached this place only once in the last 100 years,” the teacher said. “What is the probability that they will reach us today?”
