[Short, Meta] Should open threads be more frequent?

Sorry about not having units, I added code to set them but apparently it was the wrong code and I wasn't paying enough attention.

The green line is total comments, the blue line is top-level comments. The x-axis is minutes, the y-axis is number of comments.

So I did what you suggested and plotted the number of top level posts and total posts over time. The attached graph is averaged over the last 20 open threads. Code available here: https://gist.github.com/TRManderson/6849ab558d18906ede40

I don't trust myself to do any analysis, so I delegate that task to you lot.

EDIT: Changed GitHub repo to a gist
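A minimal sketch of how such averaged curves could be computed (the sample data, variable names, and 20-thread averaging structure here are illustrative assumptions, not taken from the linked gist):

```python
# Hypothetical data: per open thread, the minute (since the thread opened)
# at which each comment, and each top-level comment, was posted.
threads = [
    {"all": [5, 12, 30, 45, 60, 200], "top": [5, 30, 200]},
    {"all": [2, 8, 40, 90, 150], "top": [2, 40, 150]},
]

minutes = range(0, 241)  # x-axis: minutes since the thread opened

def mean_cumulative(key):
    """Cumulative comment count at each minute, averaged across threads."""
    return [
        sum(sum(1 for t in th[key] if t <= m) for th in threads) / len(threads)
        for m in minutes
    ]

avg_total = mean_cumulative("all")      # green line
avg_top_level = mean_cumulative("top")  # blue line

# Plot with e.g. matplotlib:
#   plt.plot(minutes, avg_total, "g-", label="total comments")
#   plt.plot(minutes, avg_top_level, "b-", label="top-level comments")
#   plt.xlabel("minutes"); plt.ylabel("number of comments"); plt.legend()
```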

[anonymous] (+6, 7y): Could you add labels so stupid people like me could figure out what that graph means? I'm guessing the blue line is top-level comments and the green line is total comments, both averaged over all the weekly open threads. The y-axis would be number of posts, and the x-axis should be time, but I'm not sure what unit it's in. Also, why does the blue line suddenly stop?

(+7, 7y): For those who don't want to guess or dig into the source, the missing x-axis unit is "minutes".

Open thread, September 16-22, 2013

That's not quite the law of the excluded middle. In your first example, leaving isn't the negation of buying the car but is just another possibility. Tertium non datur would be *He will either buy the car or he will not buy the car*. It applies outside formal systems, but the possibilities outside a formal system are rarely negations of one another. If I'm wrong, can someone tell me?

Still, planting the "seed of destruction" definitely seems like a good idea, although I'd advise caution about specifying only one event where that would happen. This idea is basically ensuring beliefs are falsifiable.
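The contrast can be written out in propositional form (a standard restatement, with P chosen to match the car example):

```latex
% Excluded middle: for any proposition P, at least one of P, \neg P holds.
\[ P \lor \lnot P \qquad \text{with } P = \text{``he buys the car''} \]
% "Buys or leaves" is not an instance of it, since leaving is only one
% of the ways of not buying:
\[ \lnot(\text{buys}) \not\Rightarrow \text{leaves} \]
```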

Does the average LW user actually maintain a list of probabilities for their beliefs? Or is Bayesian probabilistic reasoning just some gold standard that no-one here actually does? If the former, what kinds of stuff do you have on your list?

(+1, 8y): It's a gold standard; true Bayesian reasoning is actually pretty much impossible in practice. But you can get a lot of mileage out of the simple approximation: "What's my current belief, how unlikely is this evidence, oh hey, I should/shouldn't change my mind now."

Putting numbers on things forces you to be more objective about the evidence, and also lets you catch things like "Wait, this evidence is pretty good, with an odds ratio of a hundred to one, but my prior should be so low that I still shouldn't believe it."
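The "good evidence but tiny prior" case in that reply can be checked with Bayes' rule in odds form; the prior of 1 in 100,000 below is an illustrative number, not one from the comment:

```python
def posterior_odds(prior_odds, likelihood_ratio):
    # Bayes' rule in odds form: posterior odds = prior odds * likelihood ratio.
    return prior_odds * likelihood_ratio

def odds_to_prob(odds):
    return odds / (1 + odds)

# A prior of 1 in 100,000, hit with 100:1 evidence:
post = posterior_odds(1 / 100_000, 100)
print(odds_to_prob(post))  # roughly 0.001: still very unlikely
```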

(+1, 8y): With actual symbols and specific numbers? No. But I do visualize approximate graphs of probability distributions over configuration spaces and stuff like that, and I tend to use the related but simpler theorems in Fermi calculations.

(+3, 8y): What ArisKatsaris said is accurate: given our hardware, it wouldn't actually be a good thing to keep track of explicit probabilities for everything.

I try to put numbers on things if I have to make an important decision and have enough time to sit down and sketch it out. The last time I did that, I combined it with drawing graphs, and found I was actually using the drawings more; now I wonder if that's a more intuitive way to handle it. (The way I visualize probabilities is by splitting a bar into segments, with the length of each segment in proportion to the length of the whole bar indicating the probability.)

One of my friends does keep explicit probabilities on unknowns that have a big effect on his life. I'm not sure what all he uses them for. Sometimes it gets... interesting, when I know his value for an unknown that will also affect one of my decisions, and I know he has access to more information than I do, but I'm not sure whether I trust his calibration. I'm still not really sure what the correct way to handle this is.

Does the average LW user actually maintain a list of probabilities for their beliefs?

Or is Bayesian probabilistic reasoning just some gold standard that no-one here actually does?

It isn't really possible since in many cases it isn't even computable let alone feasible for currently existing human brains. Approximations are the best we can do, but I still consider it the best available epistemological framework for reasons similar to those given by Jaynes.

If the former, what kinds of stuff do you have on your list?

Does the average LW user actually maintain a list of probabilities for their beliefs? Or is Bayesian probabilistic reasoning just some gold standard that no-one here actually does?

People's brains can barely manage to multiply three-digit numbers together, so no human can do true "Bayesian probabilistic reasoning". For humans it's at best the latter, while using various practical tips to approximate the benefits of the former (e.g. being willing to express your certainty in a belief numerically when such a number is asked of you in a discussion).

Yet more "stupid" questions

Thanks. Just going to clarify my thoughts below.

Because doing so will lead to worse outcomes on average.

In specific instances, avoiding the negative outcome might be beneficial, but only for that instance. If you're constantly settling for less-than-optimal outcomes because they're less risky, it'll average out to less-than-optimal utility.

The terminology "non-linear valuation" seemed to me to imply some exponential or logarithmic valuation; I think "subjective valuation" or "subjective utility" might be better here.

(+0, 8y): You just incorporate that straight into the utility function.

You have $100 to your name. Start with 100 utility. "Hey! Betcha $50 this coin comes up heads!" $150 and therefore 110 utility if you win; $50 and therefore 60 utility if you lose. So you don't take the bet. It's a fair bet dollar-wise, but an unfair bet utility-wise.
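The decline-the-bet arithmetic in that reply can be written out directly; the wealth-to-utility numbers are the reply's illustrative ones, not a standard utility function:

```python
# Wealth -> utility, using the reply's illustrative numbers.
utility = {50: 60, 100: 100, 150: 110}

expected_dollars = 0.5 * 150 + 0.5 * 50                    # fair in dollar terms
expected_utility = 0.5 * utility[150] + 0.5 * utility[50]  # unfair in utility terms

print(expected_dollars)                      # 100.0
print(expected_utility, "<", utility[100])   # 85.0 < 100: decline the bet
```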

(+0, 8y): Yes, non-linear valuation means that your subjective value for X does not increase linearly with linear increases in X. It might increase logarithmically, or exponentially, or polynomially (with degree > 1), or whatever.

Is there any reason we don't include a risk aversion factor in expected utility calculations?

If there is an established way of considering risk aversion, where can I find posts/papers/articles/books regarding this?

(+2, 8y): Because doing so will lead to worse outcomes on average. Over a long series of events, someone who just follows the math will do better than someone who is risk-averse with respect to utility. Of course, our utility functions are often risk-averse with respect to real-world things, because of non-linear valuation: e.g., your first $100,000 is more valuable than your second, and your first million is not 10x as valuable as your first $100,000.
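One standard way to get the diminishing returns described here is a logarithmic utility function (a textbook assumption for illustration, not something the comment specifies; the $1,000 starting wealth is arbitrary, chosen because log(0) is undefined):

```python
import math

def u(wealth):
    # Logarithmic utility: each extra dollar matters less the richer you are.
    return math.log(wealth)

base = 1_000  # arbitrary starting wealth

first_100k = u(base + 100_000) - u(base)
second_100k = u(base + 200_000) - u(base + 100_000)
first_million = u(base + 1_000_000) - u(base)

print(first_100k > second_100k)         # True: first $100k worth more than second
print(first_million < 10 * first_100k)  # True: first million isn't 10x the first $100k
```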

NEW TIME: Sydney Less Wrong meetup, 23/4, 3PM

Just found this in a search for "Brisbane". I'd show up, and maybe bring a friend who is a non-LW rationalist.

More "Stupid" Questions

It's likely that Eliezer isn't leaning towards either side of the nature vs. nurture debate, and as such isn't claiming that either nature or nurture is doing the work in generating preferences.

Beautiful Math

Neither finite differences nor calculus is new to me, but I didn't see the connection between the two until now, and it really is obvious.

This is why I love mathematics - there's always a trick hidden up the sleeve!
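One way to see the connection numerically: repeated forward differences of a polynomial sequence flatten to a constant, just as repeated derivatives do (a standard finite-differences fact; the cubes example is illustrative, not from the original post):

```python
def forward_differences(seq):
    """Differences between consecutive terms, the discrete analogue of d/dx."""
    return [b - a for a, b in zip(seq, seq[1:])]

cubes = [n ** 3 for n in range(8)]   # 0, 1, 8, 27, 64, ...

d1 = forward_differences(cubes)      # 1, 7, 19, 37, ...
d2 = forward_differences(d1)         # 6, 12, 18, 24, ...
d3 = forward_differences(d2)         # 6, 6, 6, ... constant, like d^3/dx^3 of x^3 = 6
print(d3)
```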

Welcome to Less Wrong! (6th thread, July 2013)

Hey there LW!

At least 6 months ago, I stumbled upon a PDF of the sequences (or at least Map and Territory) while randomly browsing a website hosting various PDF ebooks. I read "The Simple Truth" and "What do we mean by Rationality?", but somehow lost the link to the file at some stage. I recalled the name of the website it mentioned (obviously LessWrong) from somewhere, and started trying to find it. After not too long, I came to Methods of Rationality (which a friend of mine had previously linked via Facebook) and began reading, but I...

The blue line suddenly stops because the last comment was posted at that time. I was kind of lazy about this graph: I did have labels and a legend, but apparently I was too out of it to realise they didn't show up in the PNG.

As gwillen said, the x-axis is minutes.