What I Learned Today

No frills, just learn

I read a lot of misconceptions this morning related to this article regarding Google Translate. It is not exactly fresh news, but this morning in my Telegram group @scienza this other popularization article was posted, which completely misunderstood the premises of the original academic article (and the so-called informed comments are not really informed either), so I decided to try to set the record straight and to offer a question.

In the article, the approach is referred to as a multitask learning framework:

Our approach is related to the multitask learning framework [4]. Despite its promise, this framework
has seen limited practical success in real world applications. In speech recognition, there have been many
successful reports of modeling multiple languages using a single model (see [17] for an extensive reference and
references therein). Multilingual language processing has also shown to be successful in domains other than
translation [9, 23].

where a neural network (NN) is trained on several tasks simply by adding extra tokens (in this case, the languages) to the input and ground-truth layers. The NN learns all the designated languages simultaneously, associating phrases with their ground truth with respect to the designated linguistic benchmark (BLEU scores) for the whole network, and not disjointly by direct correlation of two languages. In a sense, it probes the off-diagonal degrees of freedom of the NN.
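As a toy illustration of the token idea (my own sketch, not Google's code; the "<2es>"-style token follows the paper's example, everything else is made up): a single model can be told the target language simply by prepending a token to the source sentence. It assumes a recent C# compiler with top-level statements.

using System;

// Prepend an artificial target-language token to the source sentence,
// so that one network knows which of its learned languages to produce.
string AddTargetToken(string targetLanguage, string sentence)
    => $"<2{targetLanguage}> {sentence}";

Console.WriteLine(AddTargetToken("es", "Hello, how are you?"));
// prints: <2es> Hello, how are you?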

What a lot of people then started fantasising a little too much about is the "interlingua" process. One commenter even referred to it as a bytecode correspondence between languages, which would be unfeasible and would defeat the very purpose of this NN: the point was to re-code as little as possible of the original Translate algorithm (which already includes all the semantic, glottological, etc. work of Google's researchers and engineers), and such a bytecode would not be flexible and would require a complete overhaul of the code!

What Google researchers have seen in the NN is that the same phrase in different languages clusters according to a specific metric: t-distributed stochastic neighbor embedding (t-SNE), a sophisticated projection that reduces dimensionality while approximately preserving pairwise similarities (thus creating a low-dimensional metric space).
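For reference, a sketch of the t-SNE objective from van der Maaten and Hinton's original paper (my notation): pairwise similarities are modelled with a Gaussian kernel in the high-dimensional space and with a Student-t kernel in the low-dimensional map, and the map points y_i are chosen to minimize the Kullback-Leibler divergence between the two:

$$ p_{j|i} = \frac{\exp(-\|x_i - x_j\|^2 / 2\sigma_i^2)}{\sum_{k \neq i} \exp(-\|x_i - x_k\|^2 / 2\sigma_i^2)}, \qquad p_{ij} = \frac{p_{j|i} + p_{i|j}}{2N}, $$

$$ q_{ij} = \frac{(1 + \|y_i - y_j\|^2)^{-1}}{\sum_{k \neq l} (1 + \|y_k - y_l\|^2)^{-1}}, \qquad C = \mathrm{KL}(P \| Q) = \sum_{i \neq j} p_{ij} \log \frac{p_{ij}}{q_{ij}}. $$

Changing the "metric" would amount to swapping these kernels, or the distance they are evaluated on.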

I wonder: what if I changed the metric? Would I be able to cluster the languages according to another one?

CP Symmetry violation

Matter-antimatter symmetry violation might have found its first culprit, and the result has recently been published in Nature (here's the arXiv preprint: https://arxiv.org/abs/1609.05216 ).

Together with the results of AMS (title pasted below), it has been an interesting 2016 for the big collaborations.

What can nuclear physics contribute to such big and fancy experiments and to the new quests for dark matter or CP symmetry? All these experiments use nuclei at one point or another of their experimental chain, and all of them use the strong force and its low-energy extrapolation...

The First Five Years of the Alpha Magnetic Spectrometer on the International Space Station

Nuclear wave function as a function of r for the proton p_{3/2} state, obtained by diagonalizing the self-energy in the exact continuum used above. The first eigenvalue at -23.97 MeV (solid red line) is compared with the corresponding 0p_{3/2} harmonic oscillator state multiplied by the square root of the spectroscopic factor (dashed red line); the second eigenvalue at -2.85 MeV (solid blue line) is similarly compared with the 1p_{3/2} harmonic oscillator state (dashed blue line). The spectroscopic factors are 0.78 and 0.21, respectively.

IMHO, a good picture to didactically illustrate the effect of many-body dynamics on nuclei.

I post it here since it did not make it into the final version of the proceedings for the Pisa conference, but I think it illustrates nicely how, from the combination of several basis-state wavefunctions, we build many-body wavefunctions whose properties differ from the harmonic oscillator starting point (dashed): they are quenched, hence possess spectroscopic factors, and have a completely different tail.
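For the record, the standard definition I have in mind (textbook notation, not specific to the proceedings): the overlap function between the A-body ground state and an (A-1)-body state defines the spectroscopic factor as its norm,

$$ u_{lj}(r) = \langle \Psi^{A-1} | a_{lj}(r) | \Psi^{A} \rangle, \qquad S_{lj} = \int_0^\infty |u_{lj}(r)|^2 \, r^2 \, dr, $$

so S = 1 in a pure mean-field picture, while many-body correlations fragment the single-particle strength and quench it (here to 0.78 and 0.21).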

This is also why reaction dynamics come out very different when calculated with a complete set of many-body relations instead of a simple mean-field picture.

IMHO, it would also be difficult to reproduce the richness of these p_{3/2} wavefunctions with a single Woods-Saxon, let alone a harmonic oscillator potential (you can notice that the decay of the 1p_{3/2} is quite fat with respect to the Gaussian fall-off of a harmonic oscillator eigenfunction).
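To make the "fat tail" remark quantitative (standard asymptotics, nothing more): a harmonic oscillator eigenfunction dies off as a Gaussian, while a state bound by a finite potential (or self-energy) with separation energy E decays exponentially,

$$ u_{\mathrm{HO}}(r) \sim e^{-r^2/2b^2}, \qquad u(r) \sim \frac{e^{-\kappa r}}{r}, \quad \kappa = \frac{\sqrt{2mE}}{\hbar}, \qquad r \to \infty, $$

so the barely bound 1p_{3/2} at -2.85 MeV has a far slower fall-off than any oscillator state could mimic.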

Non-local optical potentials are the way to go! 😉

(Absolutely objective, no conflict of interest there.) More info here: arXiv:1612.01478 [nucl-th], and in a soon-to-be publication.

Enjoy.

GitHub returns

I am setting up and using my public GitHub repository after many years. I decided to start releasing some old projects publicly, since it seems apparent I will never have the time to write any more papers about them, nor to polish the code.

Eventually, at least for these little "didactic" subprojects, I will start to make little explanatory videos or blog posts instead of articles, just to save time and make the process more fluid.

So, with the philosophy of releasing as a process, not as a product, here is the first project, on pairing-vibration RPA:

https://github.com/AndreaIdini/PairingVibration

Enjoy and let me know.

MacBook Pro 2016

The 2016 MacBook Pro update still sports LPDDR3 RAM, in 8 GB or 16 GB sizes, at 1866 or 2133 MHz with CL11 or CL12 timings.

Frankly, I'm furious that a £2300 (starting price) laptop still uses a nine-year-old standard, while a newer one has been available for more than a year.

The Skylake processor is already a setback, but DDR3 memory on a laptop with Intel graphics is practically a scam.

The Dell XPS 15 has offered better specs since June, in roughly the same size.

Good riddance to the Mac sector, Apple; keep on producing expensive toys...

C# Lists and Dictionary

C# has several key differences from C++. Microsoft's website spells them all out: https://msdn.microsoft.com/en-us/magazine/cc301520.aspx

A garbage collector instead of destructors provides a different approach to the micromanagement of memory, and lists, jagged arrays, and dynamic memory allocation are built into the language. Unlike Python, though, C# collections do not accept negative indices for counting from the last element (later C# versions add a dedicated ^ "from end" operator for that purpose).
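A minimal sketch (mine, not from the Microsoft article) of a jagged array, i.e. an array of independently allocated rows; as a top-level program it assumes a recent C# compiler:

using System;

int[][] jagged = new int[3][];        // three rows, not yet allocated
jagged[0] = new int[] { 1, 2, 3 };    // each row gets its own size
jagged[1] = new int[] { 4 };
jagged[2] = new int[] { 5, 6 };
Console.WriteLine(jagged[0][2]);      // prints 3
// In C# 8+, the same element can be read from the end: jagged[0][^1]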

As for lists: instead of the usual self-made textbook-example class, there are dedicated built-in classes for lists and dictionaries.

// requires: using System.Collections.Generic;
List<type> name = new List<type>();
name.Add(element);      // appends the element at the end
name.Remove(element);   // removes the first occurrence, returns a bool

Dictionary<keyType, valueType> name = new Dictionary<keyType, valueType>();
name.Add(key, element);   // throws if the key is already present
name.Remove(key);         // removes by key only; there is no value argument
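And a quick usage sketch with concrete types (a hypothetical phone book of mine, not from the Microsoft article):

using System;
using System.Collections.Generic;

var phoneBook = new Dictionary<string, string>();
phoneBook.Add("Alice", "555-0100");
phoneBook["Bob"] = "555-0199";             // the indexer adds or overwrites
string number;
if (phoneBook.TryGetValue("Alice", out number))
    Console.WriteLine(number);             // safe lookup, no exception thrown
phoneBook.Remove("Bob");                   // by key; returns false if absent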