Friday, April 29, 2011

Epsilon in $



"In the aluminum can industry, the thickness of the top of the pop-top cans has some physical limitations. It must be (say) not be thicker than 0.4mm, or else the poptop mechanism
may not work (many of us knows this from first hand experience). It must not be thinner than 0.3mm, or else the internal pressure may cause spontaneous explosion. Hence, we might tolerance this thickness as 0.35 ± 0.05mm. Now, if manufacturing processes were more reliable, and our ability to assess tolerance were more accurate, we might be able to tolerance this thickness as 0.32 ± 0.02mm. Such a change would save the aluminum can industry multi-millions of dollars in material cost each year".


"On 4 June 1996, the maiden flight of the 8 Ariane 5 launcher in French Guiana ended in self-destruction. About 40 seconds after initiation of the flight sequence, at an altitude of about 3700 m, the launcher veered off its flight path, broke up and exploded. A report from the Inquiry Board located the critical events: “At approximately 30 seconds after lift-off, the computer within the back-up inertial reference system, which was working on stand-by for guidance and attitude control, became inoperative. This was caused by an internal variable related to the horizontal velocity of the launcher exceeding a limit which existed in the software
of this computer. Approximately 0.05 seconds later the active inertial reference system, identical to the back-up system in hardware and software, failed for the same reason. Since the back-up inertial system was already inoperative, correct guidance and attitude information could no longer be obtained and loss of the mission was inevitable.”
The software error began with an overflow problem: “The internal SRI software exception was caused during execution of a data conversion from 64-bit floating point to 16-bit signed integer value. The floating point number which was converted had a value greater than what could be represented by a 16-bit signed integer. This resulted in an Operand Error. The data conversion instructions (in Ada code) were not protected from causing an Operand Error, although other conversions of comparable variables in the same place in the code were protected.”
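To make the failure mode concrete, here is a minimal sketch in Python (the flight code was Ada; the variable name below is purely hypothetical) of the difference between an unprotected conversion, which silently hands garbage to the rest of the system, and a protected one, which at least fails loudly:

INT16_MIN, INT16_MAX = -(2 ** 15), 2 ** 15 - 1   # -32768 .. 32767

def unchecked_to_int16(x: float) -> int:
    # Unprotected conversion: the value silently wraps modulo 2^16.
    v = int(x) & 0xFFFF
    return v - 0x10000 if v >= 0x8000 else v

def checked_to_int16(x: float) -> int:
    # "Protected" conversion: raise instead of corrupting the value.
    if not (INT16_MIN <= x <= INT16_MAX):
        raise OverflowError(f"{x} does not fit in a signed 16-bit integer")
    return int(x)

horizontal_bias = 64000.0                        # hypothetical out-of-range value
print(unchecked_to_int16(horizontal_bias))       # -1536: plausible-looking garbage
# checked_to_int16(horizontal_bias)              # would raise OverflowError instead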

Saturday, March 5, 2011

Spinozian Calculus.

http://www.yesselman.com/index.htm#Intro

Epsilon-Delta level up.


Tom and I already had a really good understanding of limits and continuity, much better than we ever had before starting this project, but the FTC pushed us again to the limits of our models, so we went back digging and got an even deeper understanding.

Here is a part of my newest conclusions, as usual the full discussion is at: https://sites.google.com/site/77neuronsprojectperelman/weeks/week-19---5-6-proofs-ftc1-2
I went through Karl's Calculus pages again, the number system and limits parts. I am starting to love revising things even more. Things become clearer every time I read them, even if it is the same text.
I had a slight change of mental model after I did this, I will try to describe it.

Without thinking strictly about functions, limits are interesting places that are really related to a computation that is seemingly impossible at infinite precision, at least at first sight.
They involve interesting places involving infinities, like the integral being a sum of an infinitely large count of infinitely small numbers, or a sequence of converging quantities, where we know we can get as close as we wish to some quantity yet to be found, or derivatives, where we want to know if we can get as close as we like to some quantity to be found, starting with the ratio of 2 infinitely small quantities.
All interesting places.
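In the usual notation, these 'interesting places' are exactly where a limit hides an infinite process (standard definitions, nothing new here; x_i is a sample point in the i-th subinterval):

  f'(x) = \lim_{h \to 0} \frac{f(x + h) - f(x)}{h}    (the ratio of two vanishing quantities)
  \int_a^b f(x)\,dx = \lim_{n \to \infty} \sum_{i=1}^{n} f(x_i)\,\frac{b - a}{n}    (a sum of ever more, ever smaller terms)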
Another point is that limits are about a relationship, the relationship between a domain (input), a range (output) and a function (a computational algorithm). I am finding it useful to visualize each of these separately instead of as a curve. The idea is that you take a piece of the domain, which is continuous by the very fact that we are dealing with reals and have density (to be discussed!), and then look at the piece of the range that the algorithm produces; again luckily, the reals make it even possible for that piece to be 'continuous', because the reals are closed under the operators found in the function, and check that piece of range out.
Is it connected just like the piece of domain is? Is it not?
This is related to one other important point: it is not good to think of this as walking along the domain in some direction and looking at the range, especially not when we are talking about limits around a specific point.
When we do this, the focus is on the point, so we start from that point in the domain and expand out, and in the range, we start at the point's output and expand out. It is a subtle but important difference, especially since it made me realize how mixed together the multiple kinds of continuity were in my mind. I always used to think that a function being continuous in calculus means continuous at every point, and whenever I was confronted with having to use this, I started thinking about some kind of induction logic to show that continuous at every point made all the points somehow connected. But there is also uniformly continuous, continuous at every point but not uniformly continuous, etc... "But first, let me answer a few questions that curiosity may be stuffing into your head at this very moment. In the example of u(x) we saw a function that was continuous everywhere except at one point. Is there a function that is continuous everywhere except at several points? Yes. Is there one that is continuous everywhere except at infinitely many isolated points? Yes. Is there one that is continuous nowhere at all? Yes. Is there one that is continuous only at a single point? Yes. Is there a function that is continuous everywhere yet cannot be graphed? Yes. And later on I shall give you examples of all of these. Click here if you want to skip ahead and see this optional material right now. Or you can continue reading and you will get to it eventually."
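For my own reference, here are the two definitions I kept mixing up, written side by side in LaTeX shorthand (these are the standard textbook statements, not Karl's wording):

  continuous at a point a:    \forall \varepsilon > 0 \;\exists \delta > 0 : |x - a| < \delta \Rightarrow |f(x) - f(a)| < \varepsilon
  uniformly continuous on S:  \forall \varepsilon > 0 \;\exists \delta > 0 \;\forall a, x \in S : |x - a| < \delta \Rightarrow |f(x) - f(a)| < \varepsilon

'Continuous on S' then just means continuous at every point of S; the only difference with uniform continuity is that there the same delta has to work for every point at once.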

I have done some introspection and I came to some conclusions about what made me uncomfortable about epsilon-delta definitions.
Although I really did understand them, they still felt strange, in the sense that one would think there must be a better, nicer, more usable way to express this, something that can be manipulated easily like algebra.
But after giving this some deep thought I came out even more convinced of Bishop's argument that it is common sense,
  1. They are a relationship between domain and range
  2. The relationship is TOTALLY dependent on the function, the algorithm that maps domain to range
This means there is no general lazy way of having limits solve themselves using mechanical manipulation, because it DEPENDS on the operator. This made me coin a term for something I have been noticing the more I went into theory, be it math, computer science, or solving problems at work ... limits will always be in the category: among other categories.
The problem can be reduced to having to FIND a relationship between epsilon and delta that satisfies some constraints, or FIND a counterexample. Like all other 'inverse' problems, where we know the goal but the path is still to be found, it is also in the category: .
Those are 2 uncomfortable categories, they do not allow for laziness, impatience, or lack of intuition and very solid understanding. It is a game...
But once this had crystallized in my mind like this, it suddenly began to feel much more comfortable.
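Coming back to the 'FIND a relationship between epsilon and delta' framing, the smallest possible instance of the game (my own toy example, in LaTeX shorthand): to show \lim_{x \to 2} (3x + 1) = 7 we need |(3x + 1) - 7| < \varepsilon whenever 0 < |x - 2| < \delta, and since |(3x + 1) - 7| = 3|x - 2|, the relationship \delta = \varepsilon / 3 does the job. Nothing mechanical hands you that relationship; it comes entirely from the operator (here, multiplication by 3).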

Also, thinking about it like this, separating domain and range, looking at them separately, and looking at the problem in terms of sequence convergence for one point, started to convince me that the epsilon-delta is the best definition, because it does say in math exactly what we are saying.

I still have to wrap up my investigation of EVT and MVT and see how they are related to point continuity, uniform continuity, and continuity in calculus. This will be the last piece of the level up.
I now see MVT and EVT as THE calculus formal tools that stem from the essence of epsilon-delta and can luckily be used instead of epsilon-delta in some cases to alleviate the need for hairy statements.
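For reference, the standard statements I mean (just the theorems, not their proofs):

  EVT: if f is continuous on [a, b], then f attains a maximum and a minimum there, i.e. there exist c, d \in [a, b] with f(c) \le f(x) \le f(d) for all x \in [a, b].
  MVT: if f is continuous on [a, b] and differentiable on (a, b), then there exists c \in (a, b) with f'(c) = \frac{f(b) - f(a)}{b - a}.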
It is like this: you can build a basic airplane from pieces of metal and fuel, but if you are planning a war campaign you do not want to be planning on the level of detail of pieces of metal and fuel, you just start with airplanes; otherwise, while still possible, it becomes very hard very fast. Ah, theorems are great building blocks, reducing everything to axioms will trash the caches of our tiny brains very fast.

"Not only does the rule make life easier, but without such rules, mathematics would be so thick with undergrowth as to make it virtually impossible to understand."

Monday, February 21, 2011

Bigfoot update: Skeleton semantics, Footplant WIP.


I took some time today to work on Bigfoot (my animation research testbed) a bit.
The first thing I added was not animation-related: my UI library did not support multi-sampling when rendering 3D to textures; now it does, and all my skeletons look happier.

On the animation side, I added some skeleton semantic detection code. Before, the skeletons were only analyzed for branches: chains of bones with one child only. Now it also finds symmetries between branches; at the beginning of the video you can see how it detects that the left and right limbs are symmetrical. The next step is to give it some human skeleton knowledge, so that it automatically figures out what is a head, a foot, a leg, etc... The point of this is that it would enable running the code on large mocap databases without the need for human annotation, for purposes like machine learning.
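For the curious, here is a rough sketch of what the branch extraction step could look like; the data structures and names are hypothetical, not Bigfoot's actual code (symmetry detection then compares the resulting branches pairwise, e.g. by bone counts and mirrored offsets):

from dataclasses import dataclass, field

@dataclass
class Bone:
    name: str
    children: list = field(default_factory=list)

def extract_branches(root: Bone) -> list:
    """Collect branches: maximal chains where each bone has exactly one child."""
    branches = []

    def walk(bone, chain):
        chain = chain + [bone.name]
        if len(bone.children) == 1:
            walk(bone.children[0], chain)   # the chain continues through a single child
        else:
            branches.append(chain)          # a leaf or a fork ends the current branch
            for child in bone.children:
                walk(child, [])             # each fork child starts a new branch

    walk(root, [])
    return branches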

The other new feature, which is still very much work in progress, is footplant detection. While seemingly innocent, it can be quite tricky to get this right. Mocap is noisy, and I also want to support the more general case of 'support contact', where, for example, for an animation of an athlete hanging on a bar, the contact points with the bar would be detected, or for an unrealistic animation of a martial arts kick after taking a few steps on a vertical wall, the steps on the wall would be registered as well. This needs a different technique than simply foot height. I am researching this slowly, when I find myself needing a break from Mathematics and want to do something instantly gratifying.

In the video, green spheres are generated when there is a local minimum in joint height, blue ones when there is a local minimum in joint velocity and white for both.
You can see lots of them firing during footplants. I tried to filter the signals and that did improve the detection, but this is only the beginning, it needs to get much better.
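Here is a small sketch of the detection described above, with assumed inputs (per-frame height and speed for one joint); this is not the actual Bigfoot implementation, just the idea of filtering the signals and marking local minima:

def smooth(signal, k=5):
    """Moving-average filter to tame mocap noise."""
    half = k // 2
    out = []
    for i in range(len(signal)):
        window = signal[max(0, i - half):i + half + 1]
        out.append(sum(window) / len(window))
    return out

def local_minima(signal):
    """Frame indices whose value is lower than both neighbours."""
    return {i for i in range(1, len(signal) - 1)
            if signal[i] < signal[i - 1] and signal[i] < signal[i + 1]}

def classify_frames(heights, speeds):
    """green = height minimum, blue = speed minimum, white = both."""
    h_min = local_minima(smooth(heights))
    v_min = local_minima(smooth(speeds))
    return {i: 'white' if (i in h_min and i in v_min)
               else ('green' if i in h_min else 'blue')
            for i in h_min | v_min}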


Saturday, February 19, 2011

Friday, February 18, 2011

Is this 'limit replacement' property trivially true?


I have lately been bumping into calculus proofs where this property of limits would be really useful, but I am not sure it is true, even though it seems trivially true. I have just formulated it but have not tried to prove it yet; I will try using the formal definition of limits. Comments welcome.

Thursday, February 17, 2011

Microscopically intuitive FTC#1, take 2.


Last week, I added a small paragraph at the bottom of my Fundamental Theorem of Calculus proof attempt, trying to cast an intuitive view on the theorem,
based on my observation that, when focusing on one tiny interval, proving the FTC and understanding it is intuitive.
The paragraph was rushed and did not contain a much-needed figure, so when I showed it to Tom he could not make sense of it, although I thought it was a really nice insight.
While analyzing some FTC proofs, I became even more aware of how useful this intuition was.
I also became aware that some proofs simply require one to work out each detail of their intuition rigorously and patiently, and not much more.
I decided to take the time to polish my insight, and see if I can manage to explain it better.
One thing I realized is that it takes quite an amount of text to explain even the simplest ideas if one wants to do it right...
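The intuition itself, compressed into one line of (hand-wavy) math: with F(x) = \int_a^x f(t)\,dt and a tiny h,

  F(x + h) - F(x) = \int_x^{x+h} f(t)\,dt \approx f(x) \cdot h, \quad \text{so} \quad \frac{F(x + h) - F(x)}{h} \to f(x) \text{ as } h \to 0.

The whole difficulty of a rigorous proof is in justifying that \approx, which is exactly where the continuity of f enters.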



Limit over an interval


We are analyzing several FTC proofs to gain some insights. For now it seems all of them need analysis to be stated with enough detail to be convincing.
In this proof, I stumbled upon an assumption that can be reduced to claiming that this statement is true:
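Written out roughly in shorthand, for f continuous at x, it is of this shape:

  \lim_{h \to 0^+} \; \sup_{t \in [x, x + h]} f(t) = f(x)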

As usual, this is intuitively very true: the interval vanishes, leaving the 'sup' to act on only 'one point' if f is continuous. But that is no proof.

I have tried to detail this a bit more to see if I can prove it, the main idea behind my proof is: Courage.
I have found courage to be an essential component across many proofs and bold inventions in mathematics.
I am not sure how good it is, it feels pretty convincing, but there are 2 spots where it needs more detail, and I suspect that for these spots, there is an inescapable need for analysis (luckily we will be tackling that in the foreseeable future).


Sunday, February 13, 2011

Nested limits technicality



The last two weeks, we have been dealing with the Fundamental Theorem of Calculus and its proofs. Both Tom and I created proofs that hinge on annoying technicalities, and because of that they do not hold.
One of the problems would be solved if we could prove an innocent statement about nested limits.
I have proven it in the following document, but this proof only holds if f is continuous, which is not the case in our proofs; I will look into it some more.
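For context, the classic statement of this kind, where the continuity of f is exactly what makes everything work, is (in shorthand):

  \lim_{x \to a} f(g(x)) = f\left( \lim_{x \to a} g(x) \right) \quad \text{when } f \text{ is continuous at } \lim_{x \to a} g(x),

which is why dropping the continuity assumption on f is the sticking point.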

The proof does seem trivial, but we are realizing more and more the importance of the tiniest details in the relationships between continuity, differentiability, and integrability, and it is not always clear, especially in elementary calculus books, where proofs are given a flimsy and vague treatment. We seem to be heading straight into analysis whether we like it or not (and we do!!)

Here is the same tiny proof as a pdf: http://jadnohra.net/release/math/nested_limits.pdf

Friday, February 4, 2011

Fundamental Theorem of Calculus version 1

I managed a very 'weak' proof of FTC1; still, it was fun and will come in handy when I see better proofs and how they solve the parts that I treated with too little formality.

I also included, in a second part, yet another intuitive perspective on the theorem, using simple infinitesimal algebra.
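In that infinitesimal spirit, the whole theorem compresses into one telescoping line (my loose shorthand, not a rigorous statement): if dF = F'(x)\,dx = f(x)\,dx on each tiny piece of [a, b], then

  \int_a^b f(x)\,dx = \sum dF = F(b) - F(a),

because the sum of all the tiny differences of F telescopes to the total difference.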



Sunday, January 30, 2011

The extravagant burial of super-star Leibniz.

"Although today we recognize his contributions to be of outstanding importance, he died essentially neglected, and only his secretary attended his burial." (http://www.math.nmsu.edu/~history/book/leibniz.pdf)

Another interesting tidbit, the crucial importance of a mentor:
"In 1672 Leibniz was sent to Paris on a diplomatic mission, beginning a crucially formative four-year period there. Christian Huygens (1629–1695), from Holland, then the leading mathematician and natural philosopher in Europe, guided Leibniz in educating himself in higher mathematics, and Leibniz’s progress was extraordinary"

Feels like the equivalent of a one-on-one MSc in higher mathematics.

Yet another interesting piece of information, which underlines how everything is so simplified and post-rationalized in a way that a lot of useful information is lost, is the fact that the now 'obvious' 'Fundamental Theorem of Calculus' originally came from a publication by Leibniz (ignoring the Leibniz/Newton debate) called "Supplementum geometriae dimensoriae, seu generalissima omnium tetragonismorum effectio per motum: similiterque multiplex constructio lineae ex data tangentium conditione", or in English "More on geometric measurement, or most generally of all practicing of quadrilateralization through motion: likewise many ways to construct a curve from a given condition on its tangents", published in the scientific journal "Acta Eruditorum".

Yes, he called it 'a supplement' ... please teach the history of math!


Which brings me to the find of the month:
http://www.math.nmsu.edu/~history/ is a project that has a mission statement that is SO much in line with our attitude towards mathematics, and they even have books. I stumbled upon it while reading about Leibniz.

Mission statement: "Our journey towards utilizing original texts as the primary object of study in undergraduate and graduate courses began at the senior undergraduate level. In 1987 we read William Dunham's ..."

Thursday, January 27, 2011

66 Points to score your shooter AI.


I present a table that tries to capture the amount of AI sophistication in current shooters.
It is based on my experience, conversations with AI programmers, reviews, user comments and gameplay videos.

The points are roughly sorted by difficulty of implementation with current standard techniques.

It has been lying on my disk for quite some time, waiting for a proper article for which I am never finding the time, so I finally gave up and decided to release it in the hope that it will be useful even in this summarized table format.



Thursday, January 20, 2011

The plagiarize series - Jan C. Willems - In Control, Almost from the Beginning Until the Day After Tomorrow


"The work involved in preparing publications comes for a large part at the expense of time to think. In science, more writing goes together with less reading. The sheer number of publications makes it also very difficult to get acquainted with, and evaluate a new idea.
I miss the emphasis on breadth and depth, on quality rather than quantity, on synthesis of ideas, on debate and scrutiny rather than passive attendance of presentations, and on reflection rather than activity.
Sure, euphoria bears creativity, and skepticism paralyzes. However, questioning and criticism is an essential part of science. I have seen too many high profile areas collapse under their own weight: cybernetics, world dynamics, general systems theory, catastrophe theory, and I wonder what the future has in store for cellular automata, fractals, neural networks, complexity theory, and sync."

"Life is what intrudes on you while you are learning mathematics" (Jad Nohra & Tom Lahore)

Monday, January 17, 2011