Has it ever happened to you that you sit down to tackle a new problem, and the more you think about it the less sense it makes? If you did that at your desk, would you be considered non-productive? If you were a game developer, be it technical, artistic or managerial, and you sat there not typing for hours without making any progress, would that be bad? Well, Bertrand Russell, one of the most famous logicians of all time, did exactly that, so you are ok :)
S = {x : x is a set and x ∉ x}.
In other words, S is the set of all sets that do not contain themselves.
In more 'naive' words:
* In Seville, there’s a barber who shaves all those people who do not shave themselves. Does the barber shave himself or not? This is known as the “Barber of Seville problem”.
* Imagine a card. On one side is written, “The statement on the other side of this card is true.”, and on the other side is written, “The statement on the other side of this card is false.”
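To see where the contradiction bites, here is a tiny sketch (my own toy illustration in Python, not from the referenced PDF): encode the barber rule as a function and check both possible answers; neither assumption can hold.

```python
# Toy illustration of the barber/Russell contradiction (hypothetical sketch).
# Rule: the barber shaves exactly those people who do not shave themselves.

def barber_shaves(person_shaves_self: bool) -> bool:
    """The barber shaves a person iff that person does not shave themselves."""
    return not person_shaves_self

# For the barber himself, "shaves himself" must agree with the rule applied to him.
for assumption in (True, False):
    consistent = (assumption == barber_shaves(assumption))
    print(f"assume the barber shaves himself = {assumption}: consistent? {consistent}")
# Both assumptions turn out inconsistent, so no such barber (or set S) can exist.
```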
Bertrand Russell, one of the most famous logicians ever, struggled with this problem for a long time. In his autobiography, he describes just how hard he found the problem. Every morning, he said, he would sit down at his desk with a blank piece of paper in front of him. At the end of the day, he would still be staring at the same blank sheet of paper.
Russell’s final resolution to the problem is described in his “Principia Mathematica”, written with Alfred North Whitehead, in which he introduced a “Theory of Types” to get around his paradox. The basic idea was this: sets cannot contain themselves....
http://www.geometer.org/mathcircles/nothing.pdf
Saturday, July 25, 2009
Tuesday, July 21, 2009
My Steam gamer card, join me to talk AI while blasting baddies :P
Yes son, you can compare apples to oranges...
One of the things that bothered me while tweaking and tuning the Keltis AI heuristics was that things sometimes boiled down to comparing apples to oranges. Unfortunately I do not remember the exact details, and I am too lazy to dig them up, but I know that I had to compare values that I could not reduce to a common unit of measurement (risk, for example); it was really a matter of preference. This is not a new problem, but with my head in the details I failed to notice the obvious: this is the old topic of utility, which economists have been using for decades. Of course, as usual, I said 'aha' just after getting my head out of the details and shipping.
It was no big deal though, I ended up using utility without knowing it.
Utility is 'the' way to compare apples to oranges, but what brings me to today's rant is that I remembered this while reading in the context of my ongoing research into applying Reinforcement Learning to Animation planning.
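To make the Keltis point concrete, here is a minimal sketch of what "using utility without knowing it" looks like; the move names, scores and weights below are made up for illustration, not the actual game code.

```python
# Hypothetical sketch: folding incommensurable heuristic terms (expected score, risk)
# into a single utility so candidate moves become directly comparable.
# The weights encode preference; they are tuned by hand, not derived from anything.

def utility(expected_score: float, risk: float,
            score_weight: float = 1.0, risk_weight: float = 2.5) -> float:
    """Higher is better; risk is penalized according to a tunable preference."""
    return score_weight * expected_score - risk_weight * risk

candidate_moves = {
    "play_card": {"expected_score": 4.0, "risk": 0.8},
    "discard":   {"expected_score": 1.0, "risk": 0.1},
}

best = max(candidate_moves, key=lambda m: utility(**candidate_moves[m]))
print(best)  # the apples-vs-oranges comparison now happens on one scale
```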
The question in question [ :) :D :P ] is a very valid one about the 'essence' of Reinforcement Learning (similar to the reward hypothesis: http://rlai.cs.ualberta.ca/RLAI/rewardhypothesis.html):
Is it sensible to treat all preferences as numeric rewards on a single scale? Theoretically, yes. There is a theorem (North [4]) that if you believe four fairly simple axioms about preferences, then you can derive the existence of a real-valued utility function. (The only mildly controversial axiom is substitutability: that if you prefer A to B, then you must prefer a coin flip between A and C to a coin flip between B and C.) Practically, it depends. Users often find it hard to articulate their preferences as numbers. (Example: you have to design the controller for a nuclear power plant. How many dollars is a human life worth?)
Source: http://www.eecs.umich.edu/~baveja/RLMasses/node5.html#SECTION00032000000000000000
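A quick toy check of the substitutability axiom from the quote, with made-up utility numbers of my own: once preferences sit on a single utility scale, preferring A to B means the 50/50 lottery with any C preserves that preference.

```python
# Toy check of substitutability via expected utility (hypothetical numbers).
U = {"A": 10.0, "B": 4.0, "C": 7.0}  # made-up utilities, only U(A) > U(B) matters

lottery_AC = 0.5 * U["A"] + 0.5 * U["C"]  # coin flip between A and C
lottery_BC = 0.5 * U["B"] + 0.5 * U["C"]  # coin flip between B and C

assert lottery_AC > lottery_BC  # holds for any C whenever U(A) > U(B)
print(lottery_AC, lottery_BC)
```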
I could not find the original in free electronic format: "D. W. North. A tutorial introduction to decision theory. IEEE Transactions on Systems Man and Cybernetics, SSC-4(3), Sept. 1968. "
If anyone can provide it I would be grateful. It is always very insightful to read about the essence of these things, which usually involves reading very old papers, and in my experience it is always worth it. It gives a lot of confidence when applying things later and when doubts appear, because much thought and critical thinking went into each and every 'fact' we take for granted today and may find too naive tomorrow.
Saturday, July 11, 2009
Jad the Naive Mathematician, the absurdity of logic
Here I present my brain. It has been learning and evolving for some time, and recently it noticed that, logically, the math it thought made sense actually doesn't.
The source of 'Math'
This goes back some time, to when I suddenly felt the urge to see where math starts. Logically (and I remember this being the basis of proving things), you need to base yourself on something that is true in order to prove something else. Anybody who knows a little bit about this knows that it leads directly to axioms, Occam's razor, Goedel and co...
Useless education
It is funny that we have been thinking we know our very basic math, when we really do not even know that.
Even the Pythagorean theorem seems not so logical looked at this way. The proofs themselves use geometric manipulation of squares and triangles, making assertions about areas, and some of those proofs came from periods where an area was something intuitive and not really formalized. Come to think of it, the concept of area itself is pretty elusive, and looking for a rigorous mathematical definition leads you to Riemann and others, which is pretty recent in history. What's more annoying, I made it through school and a Bachelor in Engineering and never once heard of them. What is even more annoying, I felt I knew what an 'area' is, although if I had thought critically and logically I would have come to the conclusion that there is something elusive about it, just like I did recently.
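As a toy illustration of the Riemann idea behind a rigorous notion of area (my own sketch, not taken from the references below): approximate the region under a curve with thin rectangles and watch the sums settle down as the rectangles get thinner.

```python
# Toy Riemann-sum sketch: approximate the "area" under f on [a, b] with n thin
# rectangles (midpoint rule). For f(x) = x^2 on [0, 1] the sums approach 1/3.

def riemann_sum(f, a: float, b: float, n: int) -> float:
    width = (b - a) / n
    return sum(f(a + (i + 0.5) * width) * width for i in range(n))

for n in (10, 100, 1000):
    print(n, riemann_sum(lambda x: x * x, 0.0, 1.0, n))
```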
All of this comes after a lot of going back and trying to understand the roots of math using Wikipedia and Google; some of the results are listed at the bottom of the post.
Proof of a proof
One nice idea from this quest is Goedel and his incompleteness theorem. Naively, for me right now, it means you need to start from something in order to prove anything, and that something you started from cannot itself be proved. I will not go back and read the details, but while taking a shower just now I became curious about how Goedel proved this: did he use an axiom as a base, such that if this axiom were removed, not even this could be proven? This got me thinking about what logic is, and about the 'axioms' of logic. Logic seems to be something the brain can very easily accept and use as a base. Again going back to Engineering, much of what is left is the logic. But why? And what is logic, isn't it absurd by itself? What is the logic that logic is based on? Why does the brain readily accept it (without 'proof')?
A group of 'things', excluding 'Neo, the source'
This got me to realize that there is a certain group of things that all fall into some category for which I don't have a name: logic (needing logic to make sense), time (continuous/discrete), infinity, zero (1 over infinity!), space and its size being both endless and not absurd (same for time). All these things feel like one and the same, or like they belong to one category. We end up accepting them and even using them, but few of us really grasp them.
Think versus Grasp
I also vaguely remember something that I think Einstein said about things a human brain will never grasp, comparing it to a table with eyes looking down, never able to see what is above it (I am not sure about the exactness of any of this). But what I recently found interesting is the fact that we are able to think about these things even though we might not be able to understand them (by construction?). Why this separation? Why can't we only think about things we can understand? Does this boundary mean something, and what?
Dump and live on
I wrote this post mainly for one reason: get it off my brain to free it for thinking about more practical stuff.
Feel free to express your opinion about this at
http://forums.aigamedev.com/showthread.php?p=15004#post15004
Some of the references
http://www.mathacademy.com/pr/prime/articles/fta/index.asp?LEV=&TBM=&TAL=&TAN=&TBI=&TCA=&TCS=&TDI=&TEC=&TFO=&TGE=&TGR=&THI=&TNT=&TPH=&TST=&TTO=&TTR=&TAD=
http://www.mathacademy.com/pr/prime/articles/irr2/index.asp
http://www.google.de/search?q=proof+square+root+of+2+is+irrational&ie=utf-8&oe=utf-8&aq=t&rls=org.mozilla:en-US:official&client=firefox-a
http://en.wikipedia.org/wiki/Well-order
http://en.wikipedia.org/wiki/Infinite_descent
http://en.wikipedia.org/wiki/Square_root_of_2
http://en.wikipedia.org/wiki/Rational_number
http://eu.wiley.com/WileyCDA/WileyTitle/productCd-0470211520.html
http://en.wikipedia.org/wiki/Commensurability_(mathematics)
http://www.boost.org/doc/libs/1_37_0/libs/math/doc/sf_and_dist/html/math_toolkit/special/ellint/ellint_intro.html
http://en.wikipedia.org/wiki/Elliptic_integral
http://sci.tech-archive.net/Archive/sci.math/2006-09/msg04719.html
http://books.google.de/books?id=RM1D3mFw2u0C&pg=PA7&lpg=PA7&dq=%22rigorous+definition+of+area%22&source=bl&ots=jiarfVKaP5&sig=OAi9X-H7Hnp92BdfIuiIA911KSc&hl=en&ei=jJdXSomENIed_AahldSdCQ&sa=X&oi=book_result&ct=result&resnum=7
http://www.amazon.co.uk/gp/offer-listing/0133459438/ref=dp_olp_1?ie=UTF8&qid=1247256551&sr=8-1
http://www.amazon.com/gp/product/images/0486439461/ref=dp_image_0?ie=UTF8&n=283155&s=books
http://www.amazon.com/s/ref=nb_ss_b?url=search-alias%3Dstripbooks&field-keywords=Discrete+Mathematics&x=0&y=0
http://www.mathkb.com/Uwe/Forum.aspx/math/16463/Concept-of-measure-in-undergraduate-mathematics
http://www.google.de/search?hl=en&safe=off&client=firefox-a&rls=org.mozilla%3Aen-US%3Aofficial&hs=iW1&num=100&q=%22rigorous+definition+of+area%22&btnG=Search
http://www.youtube.com/results?search_query=The+Fundamental+Theorem+of+Calculus&search_type=&aq=f
http://www.youtube.com/watch?v=MOnnMlMM70Q&feature=PlayList&p=D4E266DF4E3352B1&index=18