## Bundle Adjustment on Mars with the Rovers

Just found out – the Mars rovers used bundle adjustment for their localization and rock modeling:

“Purpose of algorithm:

To perform autonomous long-range rover localization based on bundle adjustment (BA) technology.

Processing steps of the algorithm include interest point extraction and matching, intra- and inter- stereo tie point selection, automatic cross-site tie point selection by rock extraction, modeling and matching, and bundle adjustment”
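The core of the bundle adjustment step is a joint least-squares solve for camera poses and landmark positions from shared observations. As a minimal sketch of that idea (my own hypothetical 1D linear toy, not the rovers' actual nonlinear reprojection problem):

```python
# Toy "bundle adjustment": jointly solve for camera and point positions
# from relative observations. A hypothetical 1D linear analogue of the
# nonlinear reprojection problem the rovers solve, not their code.
import numpy as np

def toy_bundle_adjust(observations, n_cams, n_pts):
    """observations: list of (cam_index, pt_index, measured p_j - c_i).
    Camera 0 is fixed at the origin to remove the gauge freedom."""
    # Unknowns: c_1..c_{n_cams-1}, then p_0..p_{n_pts-1}
    n_unknowns = (n_cams - 1) + n_pts
    A = np.zeros((len(observations), n_unknowns))
    b = np.zeros(len(observations))
    for row, (i, j, obs) in enumerate(observations):
        if i > 0:
            A[row, i - 1] = -1.0          # coefficient of -c_i
        A[row, (n_cams - 1) + j] = 1.0    # coefficient of +p_j
        b[row] = obs
    x, *_ = np.linalg.lstsq(A, b, rcond=None)
    cams = np.concatenate(([0.0], x[:n_cams - 1]))
    pts = x[n_cams - 1:]
    return cams, pts
```

Fixing one camera is the usual trick: without it the whole solution could slide rigidly, so the system would be rank-deficient.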

## New polymath project announced – deterministic way to find primes

New polymath project – a massively collaborative mathematics project announced at the polymath blog – a deterministic way to find primes: given an integer k, the algorithm must be guaranteed to find a prime of at least k digits in time polynomial in k.
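To see why this is an open problem: the obvious deterministic strategy – scan upward from 10^(k-1), testing each candidate – is *not* known to run in time polynomial in k, because known unconditional bounds on gaps between consecutive primes are too weak to guarantee the scan stops quickly. A sketch of that naive strategy (deterministic Miller-Rabin witnesses are valid for all n below about 3.3·10^24):

```python
# Naive deterministic search: scan upward from 10**(k-1), testing each
# candidate with deterministic Miller-Rabin (valid for n < 3.3e24).
# Deterministic, but NOT known to be polynomial in k: proving that the
# scan stops quickly would need strong bounds on prime gaps.

def is_prime(n):
    if n < 2:
        return False
    small = (2, 3, 5, 7, 11, 13, 17, 19, 23, 29, 31, 37)
    for p in small:
        if n % p == 0:
            return n == p
    # Write n - 1 = d * 2**s with d odd
    d, s = n - 1, 0
    while d % 2 == 0:
        d, s = d // 2, s + 1
    for a in small:  # these witnesses suffice for n < 3.3e24
        x = pow(a, d, n)
        if x in (1, n - 1):
            continue
        for _ in range(s - 1):
            x = x * x % n
            if x == n - 1:
                break
        else:
            return False
    return True

def find_prime_with_k_digits(k):
    n = 10 ** (k - 1)
    while not is_prime(n):
        n += 1
    return n
```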

## 10,000 Year Clock construction is moving forward

Via slashdot

And if you are reading this blog and haven’t read Anathem yet, you should give it a try :)

## Randomness: our brains deceive us

Here are two point distributions: one is random, and one is not:

Which is which?

The thing is that the left image is not random, and the right one is.

Sean Carroll of Cosmic Variance writes:

“Humans are not very good at generating random sequences; when asked to come up with a “random” sequence of coin flips from their heads, they inevitably include too few long strings of the same outcome. In other words, they think that randomness looks a lot more uniform and structureless than it really does. The flip side is that, when things really are random, they see patterns that aren’t really there. It might be in coin flips or distributions of points, or it might involve the Virgin Mary on a grilled cheese sandwich, or the insistence on assigning blame for random unfortunate events.”
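Carroll's point about long runs is easy to check numerically. In 100 fair coin flips the longest run of identical outcomes averages around 7 – much longer than the runs people produce when faking randomness. A quick sketch:

```python
# Empirical check of Carroll's point: in 100 fair coin flips, how long
# is the longest run of identical outcomes, on average? (Around 7 --
# longer than the runs people include when inventing "random" flips.)
import random

def longest_run(flips):
    best = cur = 1
    for prev, nxt in zip(flips, flips[1:]):
        cur = cur + 1 if nxt == prev else 1
        best = max(best, cur)
    return best

def average_longest_run(n_flips=100, n_trials=1000, seed=0):
    rng = random.Random(seed)
    total = 0
    for _ in range(n_trials):
        flips = [rng.randrange(2) for _ in range(n_flips)]
        total += longest_run(flips)
    return total / n_trials
```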

## Sorry, no warp drive

The Star Trek-esque warp drive – the Alcubierre drive – was a mathematical curiosity in general relativity which allowed faster-than-light travel inside a bubble of warped space-time. Of course it had some problems: the bubble of space-time could be created only if some matter was already moving faster than light, it required exotic matter, and it required three solar masses to transport a single atom. Now it looks like quantum mechanics has finally put it out of its misery. Exploring the mechanism of creation of a warp bubble out of flat space-time using a semiclassical approach, a Spanish-Italian team showed that the energy at the front edge of the warp bubble would grow exponentially with time, which means the warp drive would be unstable.

Via Slashdot

## Algebra and geometry

Something I’ve picked up at The n-Category Café. Algebra and geometry are analogous to syntax and semantics, with syntax corresponding to algebra and semantics to geometry. This broad statement has a precise meaning, which can be expressed as a duality between Boolean algebras and certain topological spaces, and which is used in the study of the formal semantics of computer languages.
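The duality alluded to is, I believe, Stone duality. In rough terms:

```latex
% Stone duality (sketch): the category of Boolean algebras is dually
% equivalent to the category of Stone spaces (compact, Hausdorff,
% totally disconnected topological spaces), via
%
%   B \mapsto \mathrm{Spec}(B)  % the space of ultrafilters on B
%   X \mapsto \mathrm{Clop}(X)  % the Boolean algebra of clopen subsets of X
%
% so that B \cong \mathrm{Clop}(\mathrm{Spec}(B)) and
%         X \cong \mathrm{Spec}(\mathrm{Clop}(X)).
```

On the syntax side one thinks of a Boolean algebra of formulas (a Lindenbaum algebra); the points of its dual space are then the models, which is the "geometry as semantics" direction.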

## From financial crisis to image processing: Ignore Topology At Your Own Risk.

Very interesting article in Wired: Recipe for Disaster: The Formula That Killed Wall Street. I’m not a statistician, but I’ll try to explain it. The gist of the article is that at the heart of the current financial crisis is David X. Li’s formula, which uses a “Gaussian copula function” for risk estimation. The idea of the formula is that the joint probability of two random events can be estimated with a simple expression that uses only the probability distribution of each event, as if they were independent, plus a single parameter – the statistical correlation. So instead of looking into the relationships and connections between events, bankers just calculated one single statistical parameter and used it for risk estimation. Even more – they applied the same formula to the results of those relatively simple calculations and built pyramids of estimations, at each step applying the same simple formula to the results of the previous step. As a result, extremely complex behavior was reduced to a simple linear model which had little in common with reality.
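The idea behind the Gaussian copula can be sketched in a few lines: take the two marginal probabilities, map them onto a standard normal scale, and couple them through a single correlation parameter rho. This is a minimal sketch in Python using only the standard library (the function name is mine, not Li's notation):

```python
# A sketch of the Gaussian-copula idea behind Li's formula: couple two
# arbitrary marginals through a single correlation parameter rho.
# Stdlib only; the function name is hypothetical, not Li's notation.
from math import exp, sqrt
from statistics import NormalDist

def gaussian_copula_density(u, v, rho):
    """Density of the Gaussian copula at (u, v) in (0,1)^2.
    rho is the one number that stands in for the entire dependence
    structure between the two events."""
    x = NormalDist().inv_cdf(u)   # map marginal probabilities to
    y = NormalDist().inv_cdf(v)   # the standard normal scale
    r2 = 1.0 - rho * rho
    return exp(-(rho * rho * (x * x + y * y) - 2.0 * rho * x * y)
               / (2.0 * r2)) / sqrt(r2)
```

With rho = 0 the density is 1 everywhere, i.e. pure independence – which shows just how much of the real dependence structure is compressed into that single parameter.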

And now – an illustration from the wiki of what exactly this single parameter – correlation – is:

Here are several two-variable distributions and their correlation coefficients. For linear relationships (middle row) correlation captures the dependence of the variables perfectly. For the upper row – normal distributions – it captures the essence of the dependency: if we know one variable and the correlation, we can say something about the other. For the complex shapes in the lower row, however, the correlation is zero in every case. Each of the lower shapes would be represented, via correlation, as the upper central shape (a fuzzy ball): correlation captures no information about how one variable depends on another for those shapes. Correlation can only represent a shape as a fuzzy ellipse. Li’s formula reduces dimensionality. The thing is, dimensionality is a topological property, and you don’t mess with topological properties easily. Imagine bankers using a fuzzy ball instead of a ring for risk estimation…
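The lower-row point is easy to demonstrate: one variable can be a perfect function of the other and the correlation still comes out as zero. A minimal sketch:

```python
# Correlation can be (numerically) zero even when one variable is a
# perfect function of the other -- the point of the lower row of the
# figure.
def pearson(xs, ys):
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

# y is completely determined by x, yet their correlation vanishes:
xs = [i / 50.0 - 1.0 for i in range(101)]   # symmetric grid on [-1, 1]
ys = [x * x for x in xs]                    # a parabola, like the "U" shape
```

Knowing x here tells you y exactly, but the single correlation number reports "no relationship at all".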

Now to the image processing. Most feature detection in image processing is done on grayscale images. The original image is usually RGB, but before feature extraction it is converted to grayscale.
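That conversion is typically a weighted luminance sum; a common choice is the ITU-R BT.601 weights:

```python
# The usual RGB -> grayscale step before feature extraction: a weighted
# luminance sum (these are the common ITU-R BT.601 weights).
def rgb_to_gray(r, g, b):
    return 0.299 * r + 0.587 * g + 0.114 * b
```

Note that this collapses three channels into one number per pixel, which is exactly where the color information discussed below is thrown away.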

However, since the original image is in color, why not use the colors for feature detection? For example, detect features in each color channel separately?

The thing is, the pictures in each color channel are very similar.

Extracting blobs in each channel will in most cases triple the work without gaining significant new information – all the channels will give roughly the same blobs.

Nevertheless, there is obviously some nontrivial information about the image encoded in its colors.

Why doesn’t blob detection on each color channel give access to it?

The reason is the same as for the current financial crisis – dimensionality. Treating each color channel separately, we replace the five-dimensional RGB+coordinates space with three three-dimensional color+coordinates spaces. The relationships between the color channels are lost. The topology of the color structure is lost.

To actually use the color information, the statistical relationships between the colors of the image should be explored – something like a three-dimensional histogram of color bins, essentially converting the image from RGB to indexed color.
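A minimal sketch of such a three-dimensional color-bin histogram (the binning scheme here is my own illustration): quantize each channel into a few levels, so every pixel maps to a single bin index.

```python
# A sketch of a three-dimensional color-bin histogram: quantize each
# RGB channel into a few levels, so each pixel maps to one bin index --
# essentially converting the image to indexed color.
from collections import Counter

def color_histogram_3d(pixels, levels=4):
    """pixels: iterable of (r, g, b) tuples with channel values 0..255.
    Returns a Counter mapping bin index -> pixel count."""
    step = 256 // levels
    hist = Counter()
    for r, g, b in pixels:
        bin_index = ((r // step) * levels * levels
                     + (g // step) * levels
                     + (b // step))
        hist[bin_index] += 1
    return hist
```

Unlike per-channel processing, a bin here is a joint statement about all three channels at once, so inter-channel relationships survive.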

## “Probabilistic” CMOS

I was intrigued by reports of ultra-efficient chips based on probabilistic logic – PCMOS. After some googling I found this pdf, which clarifies the subject somewhat. It seems probabilistic logic never really enters the picture. Instead, this architecture pairs a normal, deterministic CPU with a probabilistic coprocessor. The coprocessor uses noise as the source for a random number generator (essentially an analog random number generator), which can then drive various Monte Carlo algorithms – random neural networks, probabilistic cellular automata and the like. It seems to me the gain can only be achieved for specific applications that rely on random number generators. In this respect PCMOS is no different from GPUs, DSPs and other task-specific accelerators.
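The kind of workload such a coprocessor would accelerate is classic Monte Carlo estimation. A minimal sketch (a software RNG stands in for the chip's analog noise source):

```python
# The kind of workload a probabilistic coprocessor would accelerate:
# a Monte Carlo estimate, here of pi, with a seeded software RNG
# standing in for the chip's analog noise source.
import random

def estimate_pi(n_samples, seed=42):
    rng = random.Random(seed)
    inside = 0
    for _ in range(n_samples):
        x, y = rng.random(), rng.random()
        if x * x + y * y <= 1.0:        # point fell inside the quarter circle
            inside += 1
    return 4.0 * inside / n_samples
```

The point of the architecture, as I read it, is that generating those random samples in hardware from noise is far cheaper than doing it with a deterministic pseudo-random algorithm on the CPU.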

## Polynesian stick charts were mapping wave patterns

Polynesian stick charts embodied a completely different way of navigating: they mapped not only locations but also oceanic swells – the patterns of the waves.

The specific encoding of each map was a closely guarded secret, known only to the group of navigators who owned it.

Navigating by the wave patterns, a navigator “would crouch in the bow of his canoe and literally feel every motion of the vessel.” They “concentrated on refraction of swells as they came in contact with undersea slopes of islands and the bending of swells around islands as they interacted with swells coming from opposite directions.”

Fascinating stuff – the kind of technology that could have been developed by aliens, or in an alternate history line.

## New evidence that our world is two-dimensional

Retweet from @bruces

It seems new evidence has been found that our universe is not 3D but a 2D surface. According to the holographic principle in string theory, the physics inside some volume of space can be completely described by a theory restricted to the boundary surface of that volume. That is, three-dimensional physical processes inside the volume of space can be seen as an illusory projection of two-dimensional processes on the surface of that volume. The new development here is that, since in quantum theory space-time should be granulated, that granulation should be primarily two-dimensional, on the surface of the volume. The space-time granulation inside the volume would be a projection of the surface granulation. And – here is the news – that projected granulation should be bigger than the original surface granulation (the Planck length). That bigger granulation should create additional blur, or noise, in physical effects. And it seems that noise exactly like that was detected at the GEO600 gravitational wave detector.