\(B\)-meson \(b\)-\(s\)-\(\mu\)-\(\mu\) anomaly remains at 4.9 sigma after Moriond
28 March 2017 | 1:00 pm

There was no obvious announcement of new physics at Moriond 2017, one that would have settled supersymmetry or other bets in a groundbreaking direction, but that doesn't mean that the Standard Model is absolutely consistent with all observations.

In recent years, the LHCb collaboration has claimed various deviations of their observations of mostly \(B\)-meson decays from the Standard Model predictions. A new paper was released yesterday, summarizing the situation after Moriond 2017:
Status of the \(B\to K^*\mu^+\mu^-\) anomaly after Moriond 2017
Wolfgang Altmannshofer, Christoph Niehoff, Peter Stangl, David M. Straub (the German language is so effective with these one-syllable surnames, isn't it?) and Matthias Rindfleischetikettierungsüberwachungsaufgabenübertragungsgesetz have looked at the tension with the newest data.



The Good-lookers, Matterhorn (1975): In the morning, they started their journey at CERN (or in Bern). I've made the would-be witty replacement of Bern with CERN so many times that I am not capable of singing this verse reliably correctly anymore!

The new data include the angular distribution of the decay mentioned in the title, as measured by the major (ATLAS and CMS) detectors.




Microscopically, at the level of quarks and leptons, these decays of the \(B\)-mesons correspond to the\[

b\to s + \mu^+ + \mu^-

\] transformation of the bottom-quark.




There seems to be a deviation from the Standard Model. But they see that the deviation doesn't seem to visibly depend on \(q^2\), and it's independent of the helicities, too. The first fact encourages them to explain the "extra processes" by an extra four-fermion interaction involving the fermions \(b,s,\mu,\mu\). There are various tensor structures that allow you to contract the four spinors in such a four-fermion interaction and, once they look carefully, the deviation from the Standard Model seems to be concentrated in the new physics (NP) term in the Hamiltonian:\[

\eq{
\HH_{\rm eff} &= -\frac{4 G_F}{\sqrt{2}} V_{tb} V^*_{ts} \frac{e^2}{16\pi^2} \cdot C_9 O_9 + {\rm h.c.},\\
O_9 &= (\bar s \gamma_\mu P_L b) (\bar \ell \gamma^\mu \ell)
}

\] There are numerous other possible terms a priori, up to \(O_{10}\). Also, analogous operators may have primes and the prime indicates the replacement of \(P_L\) with \(P_R\).



If you memorize this song about quarks, you should understand all the four-fermion interactions – unless you conclude that the song is about cheese, as one of the singers did. The ladies from the girl band – those on the first photograph ever posted on the web – are planning a comeback and are looking for donations.

At any rate, only the evidence in favor of a nonzero new physics coefficient \(C_9\) of the operator \(O_9\) seems strong enough to deserve the paper – and the TRF blog post – and the best fit value of \(C_9\) seems to be negative and\[

C_9 = -1.21 \pm 0.22

\] which means that the experimental data indicate that this new physics contribution to \(C_9\) is nonzero (it should be zero in the Standard Model) at the 4.9-sigma level. Not bad. Well, there is also a similar but weaker anomaly for \(C_{10}\), the coefficient that multiplies a similar operator with an extra \(\gamma_5\), whose best fit is:\[

\eq{
O_{10} &= (\bar s \gamma_\mu P_L b) (\bar \ell \gamma^\mu \gamma_5 \ell)\\
C_{10} &= +0.69\pm 0.25
}

\] which differs from the Standard Model's zero by 2.9 sigma. The numbers make it clear that the hypothesis that \(C_{9}=-C_{10}\) is rather compatible with the data, too, within one sigma, and the best fit for this \(C_{9}=-C_{10}\) is \(-0.62\pm 0.14\) or so, a 4.2-sigma deviation from zero (I believe that \(-0.62\pm 0.14\) should really be multiplied by \(\sqrt{2}\) but let me not make this confusion too visible).
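Just to see where these significance figures come from, here is a minimal sketch of the naive Gaussian arithmetic. Keep in mind that the paper's quoted 4.9, 2.9, and 4.2 sigma are extracted from the full (non-Gaussian) likelihood, so the simple best-fit-over-error ratios below only roughly track them:

```python
# A minimal sketch: naive Gaussian "pulls" of the best-fit Wilson coefficients
# away from their Standard Model value (zero for the new physics contributions).
# The paper's 4.9 / 2.9 / 4.2 sigma come from the full (non-Gaussian) likelihood,
# so these simple ratios only roughly reproduce the quoted numbers.

best_fits = {
    "C9 (NP)":        (-1.21, 0.22),
    "C10 (NP)":       (+0.69, 0.25),
    "C9 = -C10 (NP)": (-0.62, 0.14),
}

for name, (central, error) in best_fits.items():
    pull = abs(central) / error   # distance from zero in units of the 1-sigma error
    print(f"{name:>15}: {central:+.2f} +/- {error:.2f}  ->  naive pull {pull:.1f} sigma")
```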

The German/Ohio authors translate this effect into various other parameterizations of the violation of LFU (lepton flavor universality) and, if I understand the ultimate claim well, they basically say that the similar anomalies from ATLAS+CMS, LHCb, and Belle seem to be consistent with each other and with the extra new physics term that was proposed above.

Some skeptics could say that these anomalies could be due to some difficult QCD effects. But the bottom quark is pretty heavy and therefore largely "ignores" the gluey, sticky environment around it, so I tend to think that the deviation from the Standard Model is rather exciting.



I've made fun of the German language so I want to make sure that the U.S. readers don't think that they're untouchable. ;-)

If the effect exists, the authors say, the experiments could turn it into a very strong, clear deviation from the Standard Model very soon.

Theoretically, I would try to explain this four-fermion interaction by the exchange of a new gauge boson or a scalar particle but I am not capable of giving you a more refined – let alone string-inspired – detailed story about this new effect at this moment.

Pizza and simulations vs renormalization
27 March 2017 | 12:16 pm

Physicist Moshe Rozali has challenged Aaronson's fantasies about the simulation of the Universe. Let me begin with his traditionalist complaints that are more comprehensible, to make sure that the number of readers of this blog post will monotonically decrease with time:
Incidentally, my main problem with the simulation story is not (only) that it is intellectually lazy or that it is masquerading as some deep foundational issue. As far as metaphysical speculation goes it is remarkably unromantic, I mean, your best attempt as a creation myth involves someone sitting in front of a computer running code? What else do those omnipotent gods do, eat pizza? Do their taxes?
Right. The "universe as a computer simulation" should be viewed as a competitor of Genesis and in this competition struggle, the "simulation" loses to Genesis because it's a superficial kitschy fad, an uninspiring work of socialist realism.



Genesis according to Scott Aaronson. I don't want to revolt against our overlords but the sticky fingers just suck, Ms Simulator. Incidentally, the pizza is a computer case. Click on the picture to see a video by Aaronson's twin brother who explains all the details.




Aaronson responded as follows:
You should at least credit it with being a creation myth for our century. Nowadays, it’s hard to be so impressed with stories about gods battling each other with axes or bequeathing humans the gift of fire: why don’t they just use nuclear weapons, and hand out Bic lighters?
You can see a difference in their tastes. Moshe Rozali is a male feminist – beware male feminists – but he still has some respect towards the traditions and immunity against the cheesiest fads of the day. After all, the Bible has been around for over 2,000 years and there's no good reason to think that "the universe as a simulation" will come close. On the other hand, Aaronson enthusiastically embraces the P.R. of the day. The Creator should be one of us, a community organizer with dirty hands from pizza and a stinky nose from cigarettes that he lights with Bic lighters, someone who babbles about nuclear weapons even though he has never held an ordinary axe in his hand.

Sorry but I don't need all the axes in novels, theater plays, and movies to be replaced with nukes and I think that people such as Aaronson who simply have to replace all the old tools with fashionable or contemporary ones have extremely bad taste.




Moshe's real opinion is somewhat ambiguous but, between the lines, I think that Moshe agrees that this new idea of "what heroes and gods should look like now" is rather disappointing. Moshe wrote:
Oh, I could imagine many powers I’d want to bestow on my creator (or vice versa), but imagining your deity as someone no better than yourself, with no special powers or insight, does seem like a good creation myth for this century.
And the picture of God as the "average bloke" will become even more typical in the 22nd century if mankind keeps evolving towards idiocracy, which is what it currently seems to be doing.

OK, those were the less technical comments. The rest is – and the first comment by Moshe was – about the renormalization and related issues.

You know, Moshe basically says that the computer scientists and players of video games who say "it's straightforward to simulate the Universe" start with the naive expectation
that the observables you calculate have a finite continuum limit, so at every value of the cutoff you approximate them to a finite precision.
In other words, just like one can shoot a scene on a camera with a certain resolution, these naive people imagine that physics in spacetime may be obtained simply by discretizing spacetime using a lattice with spacing \(a\) and taking \(a\to 0\). All deviations from the "perfect smooth world" go to zero in the limit \(a\to 0\), they think.

Well, it's not the case in modern physics. The "quantities computed in the discretized approximation" and the "idealized finite quantities in the smooth real world" actually differ by terms that go to infinity for \(a\to 0\). These unwelcome "infinities" have to be subtracted in the definition of the theory. Moreover, at the very end, we must only look at the observables (operators) for which some continuum limit exists at all. And it won't exist for every observable.

So when you simulate the world using some very small lattice spacing \(a\), most of the quantities in your computer program will be divergent, dominated by terms such as the inverse powers \(k / a^m\) for some positive exponent \(m\) where the coefficient \(k\) has pretty much nothing to do with the interesting dynamical observables that describe the "world as we normally understand it". All these leading terms have to be subtracted in some way. If you're lucky, it can be done and the much smaller deviations from these things will correspond to the density of electromagnetic energy in the field or any other quantity you want to talk about.
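To make this point concrete, here is a toy sketch (my own illustration, not anything from the paper or from Moshe's comment): the zero-point energy of a massless scalar field on a one-dimensional lattice of \(N\) sites between two fixed walls. The raw lattice sum is dominated by divergent pieces proportional to \(L/a^2\) and \(1/a\); only after those are subtracted does the tiny, physical Casimir-like piece \(-\pi/(24L)\) emerge (units \(\hbar=c=1\)):

```python
import math

# Toy sketch (my own illustration): zero-point energy of a massless scalar field
# on a 1D lattice of spacing a between two fixed walls separated by L = N*a.
# The raw sum is dominated by divergent pieces ~ L/a^2 and ~ 1/a; the physical,
# cutoff-independent leftover is the Casimir-like term -pi/(24*L).  (hbar = c = 1)

L = 1.0  # distance between the "plates"

def lattice_zero_point_energy(N):
    a = L / N
    # standing-wave modes k_n = n*pi/L with the lattice dispersion omega = (2/a) sin(k a / 2)
    return math.fsum((1.0 / a) * math.sin(n * math.pi / (2 * N)) for n in range(1, N))

for N in (100, 1000, 10000):
    a = L / N
    raw = lattice_zero_point_energy(N)
    # subtract the divergent "bulk" and "boundary" artifacts of the discretization
    leftover = raw - (2 * L / (math.pi * a**2) - 1 / (2 * a))
    print(f"N={N:6d}  raw={raw:16.4f}  leftover={leftover:+.6f}  (expect {-math.pi/(24*L):+.6f})")
```

The raw numbers grow without bound as the spacing shrinks, while the leftover converges to the single cutoff-independent number – which is exactly the pattern described above.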

And the final outcome "it can be done after lots of work" is the lucky one which is not guaranteed. There are rather deep problems with the discretization of certain aspects of physics. In many of them, physicists remain uncertain "whether it may be done at all", even if you decide to do an arbitrarily huge amount of work. The classic problem of this kind is chiral fermions on a lattice. I think that if you organized a poll among the lattice gauge theory experts, you would get rather split answers to the question whether general theories with chiral fermions may be completely, accurately, and universally computed by lattice methods at all.

All known elementary fermions – leptons and quarks – are chiral i.e. left-right-asymmetric. The part of the field that evolves like a left-handed screw behaves differently than the right-handed part. They have different electroweak interactions. It's difficult to get this feature from the lattice because a lattice – e.g. a cubic lattice – is clearly left-right-symmetric. In the end, the very basic fact that the laws of physics are not left-right-symmetric – which has been known for more than half a century – is morally incompatible with the very idea of a discretization or a lattice. The observed violation of the CP-symmetry makes things even worse or harder for the lattice.
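A small numerical illustration of how the lattice fights back (my own sketch; the general statement is the Nielsen–Ninomiya theorem, which this snippet does not prove): the naive lattice discretization of the Dirac derivative replaces the momentum \(k\) by \(\sin(ka)/a\), which vanishes not only at \(k=0\) but also at the edge of the Brillouin zone, \(k=\pi/a\). Each extra zero is an unwanted light "doubler" fermion, and the doublers carry the opposite chirality, which is what ruins the naive attempts to put a single chiral fermion on a lattice.

```python
import math

# Sketch (my own illustration): the naive lattice Dirac operator replaces the
# momentum k by sin(k*a)/a.  In the continuum this has a single zero at k = 0
# (one light fermion), but on the lattice it also vanishes at k = pi/a --
# an unwanted "doubler" mode with the opposite chirality.

a = 1.0                                              # lattice spacing (arbitrary units)
ks = [i * math.pi / (10 * a) for i in range(11)]     # momenta from 0 to pi/a

print("  k*a     continuum k    lattice sin(k a)/a")
for k in ks:
    lattice = math.sin(k * a) / a
    print(f"{k*a:6.3f}   {k:12.4f}   {lattice:14.4f}")

# Near k = 0 the two agree; near k = pi/a the lattice operator returns to zero,
# i.e. a second low-energy fermion species is hiding at the zone edge.
```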

Even if you succeed in emulating chiral fermions using a lattice, you face additional problems such as the gauge anomalies. In theoretical physicists' jargon, anomalies are quantum effects that violate classical symmetries – including gauge symmetries – that should hold naively. But the switch to the quantum theory makes it hard to obey all the symmetries at the same time and when you add a generic collection of chiral fermions, quantum mechanics strictly implies that the symmetries just can't be preserved in the quantum theory. The explanation of all these things in terms of the discretized, lattice formalism is very hard.

Let me mention the Casimir effect. Conductive parallel plates at distance \(A\) are predicted to attract with the force per unit area\[

{F_c \over {\rm Area}}=-\frac{d}{dA} \frac{\langle E \rangle}{{\rm Area}} = -\frac {\hbar c \pi^2} {240 A^4}

\] This force is calculated as a derivative of the energy \(E\) of quantum fluctuations of the electromagnetic field. The simplest similar example is one in string theory where the string carries some zero point energy proportional to\[

1+2+3+4+5+\dots = -\frac{1}{12}.

\] Uneducated people often love to say that it's nonsense and they don't have to pay attention to string theory because a famous crackpot in their city told them so. Well, these ideas don't depend on string theory in any way. You may talk about the well-known 3+1-dimensional world and the Casimir force between the parallel plates that has been experimentally verified. The theoretically calculated energy \(E\) in the formula above ends up being proportional to the sum\[

1^3+2^3+3^3+4^3+5^3+\dots = \zeta(-3)= +\frac{1}{120}.

You can see that it's totally analogous to the sum of positive integers except that we get the sum of cubes of positive integers instead (you get them from summing over the momenta \(\vec k\), i.e. the corresponding Fourier modes of the electromagnetic field in between the plates) – the third power appears because we have three spatial dimensions, it's no coincidence. Well, in this case the sum is equal to a positive number – but again a finite one and not an integer: it's \(+1/120\).
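If you want to check these finite values without trusting anybody, here is a hedged sketch using mpmath's analytically continued zeta function (assuming mpmath is installed), together with the Casimir pressure predicted by the formula above for plates one micron apart – a separation I picked purely for illustration:

```python
from mpmath import mp, zeta
import math

mp.dps = 15

# The analytically continued zeta function reproduces the finite values
# assigned to the divergent sums above.
print("zeta(-1) =", zeta(-1), " (compare -1/12 =", -1/12, ")")
print("zeta(-3) =", zeta(-3), " (compare +1/120 =", 1/120, ")")

# Casimir pressure between perfectly conducting plates separated by A = 1 micron:
#   F/Area = - hbar * c * pi^2 / (240 * A^4)
hbar = 1.054571817e-34   # J*s
c = 2.99792458e8         # m/s
A = 1.0e-6               # m
pressure = -hbar * c * math.pi**2 / (240 * A**4)
print(f"Casimir pressure at 1 micron: {pressure:.2e} Pa  (about -1.3 mPa)")
```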

Now, just to be sure, some physicists would agree with me that it's morally right to write that the naively divergent sum of the cubes is equal to \(+1/120\). Others would say that the equation is just heuristic, that it isn't true literally, and they would offer fixes. But what are the fixes? These fixes would include various additions and complications and all of those – with the exception of the finite term \(+1/120\) – would exactly cancel at the end, whenever you calculated a physically meaningful quantity.

There are many ways to calculate the "regulated" sum of the third powers of the positive integers. They are analogous to the ways to calculate the sum of integers. The cancellations work in various ways and nothing ultimately depends on the way you choose. So the finite residual term \(+1/120\) is the "only thing" that these discretizations and other "rigorous justifications" have in common. For this reason, it makes sense to say that \(+1/120\) is the only physical part of the sum and everything else is an unphysical artifact.
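One concrete way to see this regulator-independence (a sketch with one particular regulator, the exponential cutoff \(e^{-\epsilon n}\)): the regulated sums contain explicit divergent pieces, \(1/\epsilon^2\) for the integers and \(6/\epsilon^4\) for the cubes, and once those are thrown away the leftovers approach \(-1/12\) and \(+1/120\) as \(\epsilon\) shrinks, no matter which small \(\epsilon\) you pick.

```python
import math

# Sketch with one particular regulator, the exponential cutoff exp(-eps*n):
#   sum n   * e^{-eps n} = 1/eps^2  - 1/12  + O(eps^2)
#   sum n^3 * e^{-eps n} = 6/eps^4  + 1/120 + O(eps^2)
# The divergent pieces depend on the regulator; the finite leftovers do not.

def regulated_leftovers(eps, n_max=20000):
    s1 = math.fsum(n * math.exp(-eps * n) for n in range(1, n_max))
    s3 = math.fsum(n**3 * math.exp(-eps * n) for n in range(1, n_max))
    return s1 - 1.0 / eps**2, s3 - 6.0 / eps**4

for eps in (0.05, 0.02, 0.01):
    finite1, finite3 = regulated_leftovers(eps)
    print(f"eps={eps:.2f}:  sum n -> {finite1:+.6f} (expect -1/12 = {-1/12:+.6f}),"
          f"  sum n^3 -> {finite3:+.6f} (expect +1/120 = {1/120:+.6f})")
```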

But in a computer simulation that tries to discretize physics, these unphysical artifacts completely dominate. Most of your RAM would contain "almost infinite", unphysical numbers of this form. Let us look at yet another elementary enough example: the density of the electromagnetic field energy in our Universe – which we try to simulate.

In a 2012 blog post about Feynman's path-integral explanation of the uncertainty principle, I derived that the generic trajectory contributing to the path integral for non-relativistic particles has velocities of order\[

\Delta v \sim \frac{\sqrt{\hbar}}{\sqrt{\Delta t \cdot m}}

\] where \(\Delta t\) is the minimum time in our "discretization of time", \(m\) is the particle mass, and \(\hbar\) is the reduced Planck's constant. You may see that in the continuum limit \(\Delta t\to 0\), the velocity of the particle is infinite at each point. Almost all trajectories – according to Feynman's path-integral measure – are non-differentiable almost everywhere. And this fact (perhaps an "ugly fact" according to some people's arbitrary aesthetic judgement) is absolutely essential for the path integral not to contradict the Heisenberg uncertainty principle, the defining principle of all of quantum mechanics.
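Here is a hedged numerical sketch of that scaling (a Wiener-measure caricature of the Euclidean path integral, not a full quantum computation): if the position increments over a time step \(\Delta t\) have a spread of order \(\sqrt{\hbar\,\Delta t/m}\), the "velocities" \(\Delta x/\Delta t\) you read off the discretized trajectory blow up like \(1/\sqrt{\Delta t}\).

```python
import math
import random

# Caricature of the (Euclidean) path-integral measure: position increments over
# a time step dt are Gaussian with spread sqrt(hbar*dt/m), so the discretized
# "velocity" dx/dt has spread sqrt(hbar/(m*dt)) and diverges as dt -> 0.

hbar = 1.054571817e-34   # J*s
m = 9.109e-31            # kg (electron mass, just for a concrete number)
random.seed(0)

for dt in (1e-15, 1e-18, 1e-21):
    sigma_x = math.sqrt(hbar * dt / m)
    velocities = [random.gauss(0.0, sigma_x) / dt for _ in range(100000)]
    rms_v = math.sqrt(sum(v * v for v in velocities) / len(velocities))
    print(f"dt = {dt:.0e} s:  rms dx/dt ~ {rms_v:.2e} m/s"
          f"  (analytic sqrt(hbar/(m dt)) = {math.sqrt(hbar/(m*dt)):.2e} m/s)")
```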

The same argument may be derived in \(D\)-dimensional spacetimes and the corresponding "velocities" of the bosonic quantum fields – such as the electric and magnetic vectors \(\vec E\) and \(\vec B\), which are (time or space) derivatives of the vector potential – will scale like\[

\abs{ \vec E } \sim \frac{1}{(\sqrt{\Delta t})^D}

\] It's no coincidence that the power of \(\sqrt{\Delta t}\) is the same one that you obtain from the dimensional analysis assuming the canonically normalized kinetic terms in the action. Just to be sure, the world around us has\[

D=4

\] large spacetime dimensions, so \[

\abs{\vec E} \sim \frac{1}{(\Delta t)^2}

\] and the magnetic vector \(|\vec B|\) scales in the same way. What happens if you substitute it to the density of electromagnetic energy?\[

\rho = \frac{ |\vec E|^2 + |\vec B|^2 }{2}

\] You will obviously get\[

\rho\sim \frac{1}{(\Delta t)^4}

\] The energy density of the electromagnetic field diverges and scales in this way. Imagine that you have a computer program that discretizes reality in a similar way and you want to know the density of the radio waves coming from a nearby antenna or something like that. You would think that the answer is proportional to the density of the electromagnetic energy – except that if you substitute the actual typical histories – or, equivalently, the operators for the electric and magnetic vectors – you will get a leading term that scales like that and diverges for \(\Delta t \to 0\).
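To attach a number to that divergence, here is a rough sketch (a sharp-momentum-cutoff estimate, my own illustration): the zero-point energy density of the electromagnetic field with modes kept up to \(k\approx\pi/(c\,\Delta t)\) scales like \(1/(\Delta t)^4\) and dwarfs any "ordinary" energy density you might actually care about, such as that of bright sunlight.

```python
import math

# Rough sketch: zero-point energy density of the electromagnetic field with a
# sharp momentum cutoff Lambda = pi / (c * dt),
#   rho_0 = hbar * c * Lambda^4 / (8 * pi^2),
# which scales like 1/dt^4 and dwarfs ordinary energy densities.

hbar = 1.054571817e-34   # J*s
c = 2.99792458e8         # m/s
sunlight = 1361.0 / c    # ~4.5e-6 J/m^3, energy density of bright sunlight

for dt in (1e-18, 1e-19, 1e-20):          # time steps of the imagined simulation
    cutoff = math.pi / (c * dt)           # largest momentum the discretization can hold
    rho_zero_point = hbar * c * cutoff**4 / (8 * math.pi**2)
    print(f"dt = {dt:.0e} s:  rho_0 ~ {rho_zero_point:.1e} J/m^3"
          f"  ({rho_zero_point / sunlight:.1e} times the sunlight energy density)")
```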

In this case, it doesn't mean that the finite physical result cannot be obtained from a lattice calculation. It may be obtained. But you need to know what you're actually calculating. You need to know that your computer simulation is basically "overwhelmed by infinities" at every point but there is a clever "pattern in the infinities" or a clever way to subtract various infinities in such a way that the leftover resembles the "reality as we conventionally imagine it".

In the case of the energy density, the divergent piece is nothing else than the contribution of the harmonic oscillators' \(E_0=\hbar\omega_{\vec k}/2\) zero-point energies in the momentum space, attributed to each point of the position space (or each lattice site). It can be subtracted. It's more natural to consider supersymmetric theories where bosonic fields and their superpartners, fermionic fields, produce exactly cancelling contributions to the zero-point energies. Supersymmetry is pretty and at least reduces the dominance of the unphysical infinities – but that's also why supersymmetry itself is at least "hard" on the lattice, too. The opposite relationship of physics and of computer simulations to supersymmetry is just one major example of the fact that physical and computer-science principles seem to be in strong tension with each other, to say the least.
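A schematic free-field sketch of that cancellation (my own toy, not a real supersymmetric lattice theory – constructing one is precisely the hard part): on a finite set of momentum modes, each bosonic mode contributes \(+\hbar\omega/2\) and its would-be fermionic superpartner with the same mass contributes \(-\hbar\omega/2\), so the zero-point sums cancel mode by mode; split the masses and a finite residue reappears.

```python
import math

# Toy sketch: zero-point energies of free modes on a small 1D momentum lattice.
# Bosons contribute +hbar*omega/2 per mode, fermions -hbar*omega/2 per mode.
# With degenerate ("supersymmetric") masses the sums cancel mode by mode;
# with split masses (broken supersymmetry) a finite residue survives.
# (Units hbar = c = 1; this is not a real supersymmetric lattice theory.)

def zero_point_sum(mass, sign, n_modes=50, L=10.0):
    total = 0.0
    for n in range(1, n_modes + 1):
        k = n * math.pi / L
        omega = math.sqrt(k * k + mass * mass)
        total += sign * 0.5 * omega
    return total

boson = zero_point_sum(mass=1.0, sign=+1)
fermion_degenerate = zero_point_sum(mass=1.0, sign=-1)
fermion_split = zero_point_sum(mass=1.2, sign=-1)

print("SUSY point  :", boson + fermion_degenerate)   # cancels exactly, mode by mode
print("broken SUSY :", boson + fermion_split)        # finite, nonzero residue
```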

Perhaps you could compare the generic situation in the discretization or simulation of the physical world to a film that is completely dominated by excessive brightness or by some very strong noise but that still lets you subtract the brightness or noise in a clever enough way to see the ordinary movie hiding "somewhere" inside the seemingly unusable footage. Yes, all these things – which Moshe calls post-processing – can be done but the user of the discretization or simulation must know what he should do and why. You may say that the user is nothing else than an observer in the quantum mechanical sense and observers have some cool talent to pick the physically relevant observables that have successfully jettisoned the unphysical divergent pieces.

The addition of all the divergent artifacts of the lattice isn't "physically natural" in any way – and the methodology to do these things is in no way unique. There are infinitely many ways to regularize a quantum field theory – and the diversity gets even more technical and wider because of the plethora of the "renormalization schemes" you may choose from – and we're never doing these things for the sake of the simulation itself. We're doing these calculations because of the result that all the simulations, discretizations, or renormalization schemes have in common.

In other words, all the specific additions of a particular discretization and simulation must be understood as garbage that we're not interested in and we want to throw it away. It's just totally wrong to assume that these artifacts of the regularization are "fundamental" in any sense.

I want to end with a reaction to the last paragraph of Moshe's first comment about the renormalization issues in a discretization:
So my point in all that is highlight that what you mean by simulation is different from just discretizing your model and taking the results as approximations to the true physical quantities. It is only this narrow definition of “simulation” which I think is incompatible with known low energy properties of the world. The full process, including post-processing, does give you finite approximation to physical results.
I agree with the statements as Moshe wrote them but I disagree about the relevance of the last one. To be more specific, I agree with him that the "simulation without the post-processing" cannot work at all and that the "simulation with the post-processing" can be done (assuming that the chiral fermions, anomalies, and other technicalities won't stop you). But I disagree with the implicit suggestion that because the "simulation with the post-processing" is possible, the hypothesis that our Universe is a "simulation with post-processing" is viable.



Well, one of the conventional ways to argue this while saving time is to embed one of the favorite Feynman videos. Here, in the video about the flying saucers, Feynman rightfully reminded us that the purpose of science isn't to say that things are possible or impossible all the time. Instead, science says that some things are more likely and other things are less likely. That's how the scientific approach operates.

The "simulation with all the post-processing" that Moshe basically claimed to be doable is indeed "possible". But what's more important is that as a physical theory, it remains extremely unlikely. The reason and the logic are absolutely analogous to the case of flying saucers that Feynman discusses in this video. Can you prove that it is impossible that there are flying saucers? Can you prove that it is impossible that we're living in a simulation?

No, I can't prove it but it's just very unlikely. (I would mock the intonation of the arrogant laymen – e.g. Aaronson – in the same way as Feynman did.) In the case of the simulation, it's unlikely because if someone writes a computer game (or shoots a movie), it's very likely that he won't deal with all the renormalization issues correctly, so as to preserve the agreement with the effective quantum field theory. After all, can you show me at least one Hollywood film director – or even one programmer of first-person shooter games – who can do quantum field theory calculations in at least two renormalization schemes?

And the Hollywood folks' mastery of renormalization techniques in quantum field theory is getting worse, not better, so the technological improvements of computers aren't of any help. For this reason, it's much more likely that if someone wrote a simulated world, it would only follow a caricature of the laws of Nature – much like Hollywood disaster movies only respect caricatures of the physical laws – and if it were so, we would be able to notice these violations of physics.

We haven't seen any such violations, which makes it extraordinarily likely – almost certain – that our world is natural and not a simulation written by anybody who even remotely resembles the currently active programmers or filmmakers. Period.

The true face of feminism
26 March 2017 | 4:56 pm

Four days ago, The Harvard Crimson published a rant by its staff writer Miss Nian Hu,
Beware the male feminist
which sheds some light on the insane claims that the purpose of feminism is equality between the sexes – instead of a totalitarian arrangement of society that is or was analogous to the plans of Nazis, Islamists, communists, climate alarmists, and other -ists that pick a privileged part of the society and systematically terrorize (and sometimes exterminate) the rest. Superficially, the article is an attack on the male feminists – the pathetic would-be men who vote for Hillary, call themselves feminists, wear feminist T-shirts, encourage true, female feminists around them to whine, and think about how this strategy could bring them advantages – which it sometimes does, mostly in socially putrefied environments where the concentration of similar opportunists grows too high.

Needless to say, I don't find it existentially important to defend these male feminists for their own sake – I despise these spineless and despicable parodies of men about as much as I despise their female counterparts if not more so. However, what you can actually extract from Miss Hu's rant is primarily a snapshot of her views about the sexes and the character of the movement she considers her own. And maybe the spineless shameful opportunists could use Miss Hu's rant to figure out that their immoral strategy could ultimately be suicidal, too.




I am not urging you to waste your time with this whole worthless "essay" by an intellectually worthless author but it's obviously desirable to pick a few quotes to be sure that we know what we're discussing here:
...What these male feminists fail to realize is that, as men, they will always be oppressors...

Feminism does not need men. This simple statement alone will, no doubt, spark cries of misandry and male genocide. After all, in a world that caters exclusively to men, it is revolutionary to claim a space or a movement where men are not considered integral.

On the contrary, feminism is a radical and revolutionary movement that will upheave the status quo and remove men as the monopolizers of power. In general, people don’t like to lose power, especially when they’ve had it for so long. Feminism is not supposed to be palatable to men; it is supposed to be threatening.
The rest is pretty much repeating the same hateful remarks. To those who say that feminism is something that should be allowed in polite society, I must say: Please, ladies and gentlemen, give me a break.




What's shocking is that these insane calls to liquidate men aren't just a hobby of Miss Hu – who believes that she will be able to graduate in 2018. Her page reveals that her concentration is in Government and her secondary field is in Women, Gender, and Sexuality Studies. So she doesn't want to graduate just despite these absolutely unacceptable rants. She wants to graduate largely because of them.

Clearly, there has been no adult in the room at Harvard so far – otherwise this female would have been removed from Harvard quite some time ago.

In principle, women can do most things that men can do. However, it was probably a net negative for Harvard to allow female students. There were several moments at which the feminization of the famous all-male Harvard College began. In 1872, the Women's Education Association was founded outside Harvard, the 1879 "Harvard Annex" allowed women to study at Harvard as appendices, Radcliffe College was chartered in 1894, and classroom instruction was merged in 1943.

The female counterpart of this, for a long time, all-male school was Radcliffe College. Most people outside Massachusetts have never heard of it. The relative fame of Harvard and Radcliffe honestly reflected the relative importance of men and women for human activities that depend on education and scholarship. In 1963, Radcliffe students received Harvard diplomas for the first time, a non-merger merger agreement was signed in 1977, and the unification was completed in 1999. I've only known the Radcliffe buildings as the home of the Radcliffe Institute for Advanced Study. The similarity of its name to the IAS in Princeton surely overstates the importance of Radcliffe.



Another video about the true face of feminism – mostly about men somewhere in Latin America who enable feminists, i.e. nude female savages, to paint the men with feces and throw urine and excrement at a cathedral that the feminists need to desecrate. I am sure that every decent enough human being – male or female – is disgusted by these feminists and agrees that any similar kind of a "war between the sexes" that these loons are trying to ignite is terribly wrong.

One question is what the average female member of a scholarly community – one that contains as many females as males – may actually contribute to the scholarship. Those who think that this number is (or ever was) close to 50% of the total scholarship are absolutely detached from reality and have abandoned the last traces of their common sense. But with a constructive approach, the efficiency of the institution would only decrease at most by a factor of two.

However, the situation may be much worse. As the video embedded above exemplifies, a wrong sub-community may make truly negative – and hugely negative – contributions to an institution such as a cathedral or a university or anything of the sort.

Let me return to Miss Hu's tirade against the male feminists. These people are ludicrous relative to full-blown men but what Miss Hu fails to see is that these pathetic opportunists are still key for feminism, including the Nazi-style feminism preferred by Miss Hu. Men have created most of the valuable things in history – including science, technology, culture, architecture – but they have been responsible for most of the bad things as well – which includes wars and harmful ideologies.

Whether you like this fact or not, feminism is only relevant because of the "work" done by some men, too. The feminist whining and the constant, never-ending claims to victimhood don't represent any actual power in isolation. A pathetic bitch who whines and claims to be oppressed all the time – even though everything she has ever achieved was purely because of affirmative action – isn't innately strong. She's only strong because some other people who actually control things, and they're overwhelmingly male, choose to pay lip service to the feminist junk.

So, dear feminists, the male feminists are pathetic but you're even more petty. You're just tiny little appendices attached to the petite penises of the male feminists. You can't make any revolution by yourself. You can at most annoy a man persistently enough so that he will prefer to turn himself into a tool that enables this inhuman ideology. But at the end, it's his decision, not yours, that matters.

When I make this comment, it's hard not to think about the Polish cult movie "The Sexmission" from the 1980s about an all-female totalitarian society that lives underground. Two males accidentally get there after a hibernation experiment takes longer than expected. There is lots of fun – and there are serious questions – addressed in the movie, which was partly a satire of the communist regime. But one of the shocking developments is the finding that Her Excellency – the women's dictator – turns out to be male, the last male who survived a crisis. That clarification is applicable to the contemporary real-world feminism, too.



Pamela Geller chose the same title as I did in 2014. She discussed the fact that we know feminists who denounce video games but none of them would ever dare to do anything against the raping of thousands of girls in the U.K. by Muslims, among other things. This comparison shows the feminists' sickly twisted priorities or hypocrisy. But from another perspective, it also shows that they're not an actual political or physical power that would have to be considered in isolation. They can only fight very weak "enemies" – and only if these "enemies" largely allow them to do so.

Even though John Harvard is turning in his grave, the current Harvard University enables Miss Hu to perform her hateful crusade against men – and that's the main reason why she is still doing so. If at least one person who matters at Harvard had some decency, such things would stop happening almost instantly.

