WikiLeaks is a non-profit journalistic website that focuses on releasing large sets of documents that would otherwise remain secret or inaccessible to the public.

There is a lot of controversy over the ethics of what WikiLeaks is fundamentally doing. Many have a polarized reaction to WikiLeaks, either loving or hating the notion of what it represents. On one hand, the organization attempts to expose what is actually happening in the world, based on factual data rather than people’s opinions or the media’s spin. In a world where much of the media appears to serve the interests of large corporations, it becomes difficult for many Americans to trust their TVs for reliable news.

WikiLeaks promises to always publish documents in unedited form, letting the truth speak for itself. It also never reveals the identity of the person or people who provided the leak. This concerns some observers, since it becomes difficult to know whether WikiLeaks’ sources can be trusted. Beyond that, many in the establishment simply do not approve of WikiLeaks’ agenda.

Others dislike WikiLeaks on the grounds that the organization may be poking around in areas too big for it to fully understand. WikiLeaks is often compared to the likes of Edward Snowden, as both raise the same fundamental questions about secrecy. Wikipedia takes a somewhat negative stance, stating that many people worry WikiLeaks can “endanger innocent lives”.

Do we trust our governments to act in our best interest? If so, should we grant them the privilege of secrecy, or should the people demand a more transparent system, one not obfuscated with complications meant to confuse or bore the public into staying uninvolved?

Considering the recent leaks of emails from Hillary Clinton and the DNC, it seems pretty clear that these leaks are not endangering innocent lives, but rather exposing serious corruption in our democratic system. While people established in the Democratic Party may argue that WikiLeaks is attacking the political system itself, which may be counterproductive, many others do not see it in the same light.

Many Americans are frustrated with the media coverage of Hillary Clinton, as it seems everyone with money is pushing for a Hillary victory. Others are disgruntled with the way Bernie Sanders’ campaign ended, as many believe the election was either rigged or unjustly influenced. Let’s dive into the emails themselves to learn what happened.

Directly from wikileaks.org, we see that they have released over 30,000 emails between Hillary and various people.

On March 16, 2016 WikiLeaks launched a searchable archive for 30,322 emails & email attachments sent to and from Hillary Clinton’s private email server while she was Secretary of State. The 50,547 pages of documents span from 30 June 2010 to 12 August 2014. 7,570 of the documents were sent by Hillary Clinton. The emails were made available in the form of thousands of PDFs by the US State Department as a result of a Freedom of Information Act request. The final PDFs were made available on February 29, 2016.

On July 4th, WikiLeaks also released over 1,000 selected emails from Hillary’s archive that were pertinent to the Iraq war. This was done in anticipation of the UK’s release of the Chilcot report on the Iraq war, which concluded that the UK regretted the end result of what had happened in Iraq but had gone along largely to support its strong ally, the US.

Around this time, Julian Assange stated that there is enough information out there to indict Hillary Clinton, but that doing so is impossible with Loretta Lynch in charge. Julian was exactly right: the FBI did find wrongdoing by Hillary, but did not follow up with any charges.

There is clear evidence of Hillary lying about these emails.

Everyone in the media talks about the security of these emails, but few discuss their content. Perhaps worst of all, it seems Hillary Clinton helped push the UN to take out Gaddafi in Libya, destroy the country’s plans for a gold-backed currency, and open its energy reserves to exploitation.

This type of action is just one example of many times the Clintons have exploited foreign countries for mining or energy resources, helping large Western corporations profit.

In an interview, Julian Assange said that the CIA warned Hillary that going into Libya would most likely lead to terrorist groups such as ISIS taking control of the country. Hillary did not care, and pushed the UN into Libya anyway. If anything, it seems that Hillary jeopardized the safety of Americans by enabling regime changes that support the growth of terrorism.

With NATO now pushing nukes up to Russian borders and conflicts in Syria and Turkey, tensions are starting to rise, and many wonder if Hillary could bring about a second Cold War.

Less than 10 days ago, WikiLeaks was at it again, releasing almost 20,000 emails from the DNC. WikiLeaks had this to say about the documents:

Today, Friday 22 July 2016 at 10:30am EDT, WikiLeaks releases 19,252 emails and 8,034 attachments from the top of the US Democratic National Committee — part one of our new Hillary Leaks series. The leaks come from the accounts of seven key figures in the DNC: Communications Director Luis Miranda (10770 emails), National Finance Director Jordon Kaplan (3797 emails), Finance Chief of Staff Scott Comer (3095 emails), Finance Director of Data & Strategic Initiatives Daniel Parrish (1472 emails), Finance Director Allen Zachary (1611 emails), Senior Advisor Andrew Wright (938 emails) and Northern California Finance Director Robert (Erik) Stowe (751 emails). The emails cover the period from January last year until 25 May this year.

It is clear that this is just the first release in a series of many to come. One source claimed that six releases were planned, staggered in the lead-up to the election. What is interesting is the particular timing of these emails. Assange said in an interview that it is simply a difficult task to get all of this material sorted and indexed. He set a goal of getting the first set out before the DNC national convention, as he thought many people would be disappointed if it came out any later.

But what if these emails could have been released before the primaries? This information clearly would have helped Bernie Sanders. With more scandals coming out about Hillary, and with a delayed rape charge against Trump, it is now becoming difficult for Americans to trust either party.

Debbie Wasserman Schultz, the head of the DNC, was forced to resign, which is interesting, considering that she was not one of the people with the most emails released. It seems she has taken a bullet for the DNC, making it look like she is accepting all of the responsibility for the wrongdoing in the emails. Hillary responded by hiring DWS for her campaign within a couple of days.

So what did the emails show? First, it is evident that there were discussions about undermining Bernie; one email specifically proposed attacking him over his religious beliefs.

The DNC followed up at the convention by apologizing to Bernie, but obviously Hillary was still nominated.

These emails also address the media’s response to WikiLeaks’ last leak of Hillary’s emails. Morning Joe had been ripping her to shreds for lying, and redditors were going wild.

The emails also specifically showed that the DNC intervened with the media, in particular by telling Morning Joe to stop hating on Hillary. Redditors also noticed that the show’s political coverage changed right around the time of the DNC’s email, showing that Morning Joe did listen to the DNC.

The media attempts to discredit the WikiLeaks documents by claiming that the source is Russia. WikiLeaks states that it authenticates all of its released documents, ensuring that they are indeed accurate. Assange also goes on to say that multiple groups have hacked the DNC. Even if the Russians did hack the DNC, who cares where the documents came from if they are factual? Snowden said that if Russians hacked the DNC, the NSA would clearly know about it.

In an email on May 21st, DWS already considered Bernie out of the race, when there were still 9 states that had not yet held their primaries.

DWS also interfered with CBS’s coverage by appearing on “Face the Nation”, as her emails leading up to this appearance were decidedly pro-Hillary.

“It’s clear that Bernie messed up and that we’re on the right side of history,” Miranda wrote in another bullet point, referring to the Nevada convention.

The above quote truly shows the bias of the DNC against Bernie and for Hillary. The DNC is supposed to support all of its candidates equally.

A Politico reporter also emailed the DNC their stories before they were published, which violates fundamental principles of journalism ethics.

The DNC also posted fake job listings on Craigslist, written as if they came from Trump’s campaign and implying that women hired would be subjected to humiliating treatment. They were trying to fabricate dirt to throw at the Trump campaign.

The DNC also appointed big donors to government positions, showing how just about everything in politics can be bought.

WikiLeaks did expose big donors’ personal information, including bank account numbers and social security numbers.

The DNC also made fun of a black person’s name.

People working within the Democratic Party were told not even to look at wikileaks.com, as the site would supposedly contain dangerous malware. Julian Assange specifically said that there is no such risk: even if the leaked emails contained malware, those malicious files would not carry over to the text copies hosted on wikileaks.org. Anyone with a basic understanding of computers, email, and websites should easily understand this.

It is very clear that the DNC does not want their own people to look into its internal affairs.

Remember, Hillary blatantly lied multiple times by saying that she had no private email server, sent no classified communications, and the like. WikiLeaks releasing these documents is proof that Hillary is a habitual liar.

Facebook has also been blocking Assange’s releases: it did not let WikiLeaks post material, claiming technical difficulties.

Google has not been showing the result “Hillary for prison”, which was alleged to be a top search term. Bing would show “Hillary for prison” as the first autocomplete suggestion after typing “Hillary”, yet Google did not. Even to this day, Google suggests “Hillary for prison sticker” over “Hillary for prison”. There is no way people are Googling “Hillary for prison sticker” more than “Hillary for prison”, which suggests that Google is blocking this search term.

It seems that not only has the traditional media been compromised, but also social media and search engines.

The Washington Post had just about nothing interesting to say about the DNC leaks. This Bilderberg media company would not like to dive too deep into Hillary.

Julian Assange is not done with Hillary. It appears that this DNC release is the first of six. Assange has also stated that he will put out enough information to allow for the arrest of Hillary Clinton.

Why did Debbie Wasserman Schultz get fired? Did she specifically say bad things in the emails, or did she simply take a bullet for all of the DNC? The wound from this metaphorical bullet was soon healed with a new position on Hillary’s campaign.

Why is it so difficult to find specifics from the emails in the media? The media is known for having democratic values and liberal interests. Is it possible that the media is purposefully not discussing the details of the emails? Googling around for 30 minutes and skimming through 10 mainstream news articles seems to provide very little information about what actually was released in the emails. They simply state that emails were released. They talk about Trump’s reaction, or the fact that DWS got fired, but they don’t seem to focus too much on what is actually in the emails.

After all, the DNC vice chairwoman had been a CNN political commentator. Now that she is filling DWS’s shoes, she has cancelled her contract with CNN. The fact that the vice chairwoman of the DNC was already working for CNN is suspicious in itself.

A few months ago, I started searching for information about the **Bilderberg group**. It is alleged that they essentially pick presidents, along with a lot of other crazy Illuminati new world order conspiracy bullshit.

And then I kept looking for more credible, *factual* information. There essentially is none.

A logical conclusion would be that a bunch of crazy people are just saying crazy things. But… people don’t just start talking about something for NO reason. Something must have triggered it. Let me put it this way, the Bilderberg Group does exist, right? What is its purpose then?

It’s just kind of… funny that no real media attention would be given to this meeting. Maybe it even makes some of you realize how corrupt global media is? The fact that mainstream media doesn’t pick up on this story is quite puzzling.

But what if every major media company is “indirectly” owned by the Rothschilds? Then it all starts to make sense. Okay, maybe it’s not literally the Rothschilds that own every media company, but maybe there are 6 really rich people who do…

(Another quick anecdote: Dr. Steven Greer shared information with a mainstream news reporter about similar secret activities. The reporter got super excited, as she thought this would be huge to break to the public. Dr. Greer told her that her boss wouldn’t let her do the story. She thought she was hot shit and could say whatever she wanted to the world. She was wrong… I’m not saying that aliens have visited Earth, but Dr. Greer has gotten dozens of ex military and government personnel to say that they would testify under Congress that they saw or knew about projects involved with aliens.)

I hereby acknowledge that none of the above text represents my opinion, as I have not yet made a full opinion on this topic. I am simply looking for the truth about how this world works, and attempt to model the truth with a superposition of beliefs of myself and others.

Today’s world is dominated by data. There is labeled data, or “data that is known to be true”. And there is unlabeled data, or “data whose validity we are unsure of”. (That’s just how I define it for this discussion.) In an era where we have nearly unlimited data (or at least that’s what our cell phone plans say), it seems obvious to always trust the labeled data. So why would I be interested in conspiracy theories, which inherently seem to have no labeled data?

A fundamental flaw that many humans make is to assume that unlabeled data is labeled data. I made this mistake when I was a naive undergraduate, because I believed everything that I was told by my “smart professors”. This isn’t a problem, unless the data is actually false. So we get by okay, as long as we work together to reject all of the bad unlabeled data. Perhaps the scientific method is an algorithm for acquiring labeled data.

I soon realized that many intellectuals trust this labeled data so much that they will hold onto it to the grave. They refuse to even listen to unlabeled data. It makes logical sense. Why would I even read unlabeled data if I could choose labeled data? Well, the point is that no labeled data is truly labeled; we just assumed it was…

Once I realized this, I suddenly stopped caring about subjects that were abundant in labeled data. Solving problems with labeled data is not interesting. It is easy. You simply do the research and regurgitate all of the information. At no point do you need to do any critical thinking. It seems that the true process of learning is not collecting labeled data, but converting unlabeled data to labeled data.

When you look at it this way, fundamentally, the most difficult questions to answer are ones in which all of the known data is unlabeled. How can anyone make sense out of all of this junk? These smart people who have prided themselves on memorizing a collection of labeled data immediately fail at answering these questions… Think about it this way, anyone can write a computer code to take a shitload of labeled data and deduce something useful with it. What about a shitload of unlabeled data?
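Writing code that deduces something from unlabeled data is, in machine-learning terms, semi-supervised learning: converting unlabeled data into labeled data. A minimal self-training sketch with made-up toy numbers (my own illustration, not anyone’s published algorithm):

```python
# Toy self-training loop: convert unlabeled data into labeled data
# by assigning each point to the class with the nearest centroid,
# then updating that class with the newly labeled point.
# All numbers here are made up for illustration.

labeled = {"low": [2.0, 3.0], "high": [10.0, 11.0]}
unlabeled = [2.5, 9.5, 4.0, 10.5]

def centroid(xs):
    return sum(xs) / len(xs)

while unlabeled:
    x = unlabeled.pop(0)
    # Choose the class whose current centroid is closest to x.
    best = min(labeled, key=lambda c: abs(x - centroid(labeled[c])))
    labeled[best].append(x)  # x has been converted into labeled data

print(labeled)
```

Each newly labeled point shifts the centroids, which then decide the labels of later points, which is roughly the bootstrapping described above.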

I think I have demonstrated that the following question may be one of the most difficult questions to answer with any confidence. Does difficult imply interesting? Well, maybe it’s important…

As a consequence of the **first principle**, all members must make conscious and rational decisions which take into consideration not only their own desires, but the *desires of everyone in the global community*. __For example__, if I am a CEO of a company which harvests honey from bees, I must make a conscious effort to understand the consequences of all of my possible business decisions. While personal success is still a primary motivation, the first principle challenges me to make sure that I do not take away from the potential opportunity for future growth in society. For example, it would be globally irrational to exploit the bee population to make more money in the present moment. By not recognizing that my greedy business decisions could potentially affect the global ecosystem and economy, I am violating the first principle. The first principle does not dictate anything about which decisions should be made, but rather demands that a member of the Movement takes the time to consciously consider the implications of their actions, rather than hastily making decisions.

The **second principle** states that *all members must attempt to make decisions* which __benefit themselves and the global community__, when possible. Therefore, each citizen of the Movement is required to take social responsibility for their actions, even if the outcome of their actions cannot be fully predicted or determined. For example, if I am still the honey manufacturer CEO, the second principle requires me to act in a way that sustains the global honey supplies, as this would be beneficial to all members of society. Therefore, the second principle can affect the decision making process. If no solution can be found which seems to benefit the individual member and the global society, then the member is free to make transactions which benefit themselves and may potentially harm members of the global society.

I invite you all to join me in attempting to create a sustainable society which is globally conscious.

Essentially, TPP allows corporations to act as government bodies in certain instances. I’m sure it does a bunch of other ridiculous things that I don’t understand. This seems like the most important and obvious one to me.

Nobody reads Garrett Lisi’s short paper on the theory of everything. It is pretty clear that no citizen of the US will read all of the TPP. So shut the fuck up about how TPP is going to ruin your life. You are simply scared because of uncertainty. Uncertainty about TPP.

It is that uncertainty which is what makes TPP so powerful. Nobody that wrote it knows how it will play out. Therefore, it is up to society to interpret TPP as a conscious entity, rather than to just shit on it from the start.

TPP is good because government is currently corrupted by corporations. Instead of continuing this corruption, TPP admits power to capitalistic corporations. This will be good for capitalism. It will cause growth. Even better, it gives powers to corporations that they already use. It’s similar to making drugs legal in society, ultimately giving the people more freedom.

People don’t vote, it doesn’t work. But people buy. They will always buy. You, YOU! have the choice to buy from whatever company that you want. If you want to destroy yourself, then you are more likely to buy a product that destroys the planet. I have a feeling the smartest people in the world don’t want to destroy the planet, so it won’t happen.

So basically, TPP forces evolution of the human race. From a Darwinian perspective, TPP makes a lot of sense, because it will force those poisonous consumers to slowly get poorer and die away.

So literally, all you need to do to save this planet is to put your money into companies that you trust. It is now a social obligation of all humans to understand where and how their products are made. If you do not support slave labor, then maybe you shouldn’t buy from certain companies.

My current view is that the world has a classical (real) description, which can be further decomposed into a quantum mechanical (complex) description. My opinion is that all physical observables must be real (classical) numbers, yet the physical quantities may be described in terms of complex (quantum) operators. After studying QFT, it is clear that the quantum vacuum is real. However, all classical theories neglect this vacuum, which is known to have inherently quantum phenomena. It is as if we declare classical mechanics wrong because it leaves out the quantum vacuum. The theories we have subtract out the quantum vacuum, which results in a theory that must be quantum mechanical and no longer allows for a classical or real description. We have a class of phenomena which are described as inherently “quantum”, but I am arguing that we should look at these as inherently “vacuum” phenomena. Whether we analyze the system with real numbers (classical) or complex numbers (quantum), both should be an adequate description of everything we observe, since everything we observe must be “real”, by construction.
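For what it’s worth, the claim that observables come out real even though the operators are complex is already standard quantum mechanics; the usual one-line argument (a textbook aside, not taken from the sources discussed here) is that any Hermitian operator has real expectation values:

```latex
% For a Hermitian observable \hat{A} = \hat{A}^\dagger and any state |\psi\rangle:
\langle A \rangle^{*}
  = \langle \psi | \hat{A} | \psi \rangle^{*}
  = \langle \psi | \hat{A}^{\dagger} | \psi \rangle
  = \langle \psi | \hat{A} | \psi \rangle
  = \langle A \rangle
```

So the measured numbers are real by construction, even though the operators and states live in a complex Hilbert space.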

In this post, I will review Nassim’s “outlandish” paper called “The Schwarzschild Proton”, which can be found here: http://hiup.org/wp-content/uploads/2013/05/AIP_CP_SProton_Haramein.pdf

The most popular refutation of such work is found here; I will refer to it throughout as Up: http://azureworld.blogspot.com/2010/02/schwarzchild-proton.html

The whole point of the paper is to say that all objects which can be approximated as pointlike objects are actually black holes. While this sounds outlandish, it actually is quite smart. The whole point of a black hole is that no information can leave. Therefore, by stating that a proton is a black hole, you are integrating out all degrees of freedom of substructure. There is no need to talk about the quarks and gluons holding the proton together, because they are all within the event horizon and therefore should have no impact on dynamics. Therefore, it seems that Nassim has created a classical model of some aspects of the strong force from gravity.

Nassim then uses this and some simple classical gravity arguments to obtain nuclear properties without any use of the strong force. He models the proton as a sphere with a radius equal to its Compton wavelength. He then argues that if two of these protons were to spin around each other at the speed of light, this could explain why nuclei are stable. I don’t think Nassim is trying to say that the strong force does not exist. He is simply noting that there may be some self-consistent model which integrates out the strong force.
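To put a number on the model, here is a quick check of my own (not code from the paper): take the radius to be the proton’s Compton wavelength and ask what mass a black hole of that radius must have, from the Schwarzschild condition r = 2GM/c².

```python
# Constants (SI units, rounded CODATA values)
G   = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
c   = 2.998e8        # speed of light, m/s
h   = 6.626e-34      # Planck constant, J s
m_p = 1.673e-27      # measured proton mass, kg

# Radius of the model proton: its Compton wavelength
r = h / (m_p * c)                  # ~1.32e-15 m

# Mass required for a black hole of Schwarzschild radius r:
#   r = 2 G M / c^2  =>  M = r c^2 / (2 G)
M_schwarzschild = r * c**2 / (2 * G)

print(f"Compton wavelength: {r:.3e} m")
print(f"Schwarzschild mass: {M_schwarzschild:.3e} kg")
print(f"Ratio to measured proton mass: {M_schwarzschild / m_p:.3e}")
```

The result, on the order of 10¹²  kg, is enormously larger than the measured proton mass; in this picture the difference is attributed to vacuum energy rather than dismissed as a contradiction.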

Next, I would like to address the critiques in Up, the blog entry linked above. Nassim starts by saying the quantum vacuum has an energy density. He argues that if you take the volume of a sphere with the radius of a particle’s Compton wavelength, you can access all of the vacuum energy inside it (which is far larger than the rest mass). He therefore defines a fundamentally new type of mass term in his theory. The Up post seems not to understand this at all. Its first complaint is that Nassim claims the mass of the proton is much larger than what we measure in a lab. No shit, Nassim is including the energy from the vacuum as well. We already know that dark energy represents most of the energy of the universe, so why is this surprising? It is clear that there is hidden energy in the universe. We don’t even understand how mass is created in the Standard Model. We even state that the bare mass of all particles is infinite; renormalization is the procedure used to make the mass finite.

So in some senses, Nassim’s theory is MUCH MORE REASONABLE than quantum field theory, a well established theoretical framework riddled with infinity problems. Note that the reason we do not understand quantum gravity is because of these infinity problems. My immediate belief when I learned QFT was that there are no infinity problems in the proper description, and that the bare masses must actually all be somehow related.

In Nassim’s theory, he is saying that the bare mass is defined by the Compton wavelength and the vacuum energy density. This is quite a beautiful picture. It makes a lot of sense too: if you were to get arbitrarily close to an electron, you would not feel as divergent a force as QFT predicts. Sure, it would still go to infinity, but not as fast!

Up continues to talk about Hawking radiation and argues that a proton with this mass would not be physical. They clearly do not understand what Nassim’s picture of mass is. He never claims to find the physical mass; he simply works with the bare mass. All of our physical theories are based on physical mass values, so obviously you will get nonsense if you plug bare mass values into equations that expect physical masses. Up makes the same mistake when talking about stability. Up also claims that this model does not describe the substructure of the proton. So what? It isn’t trying to. Physics is about coming up with approximate models. Obviously QCD is going to get a more correct result than Nassim’s theory, but Nassim is never invoking quantum mechanics! His theory is purely classical. It is very simple and intuitive.

The next incredible fact Nassim finds is that if you calculate the bare mass of one proton, you get roughly the mass of the entire universe, which is not surprising in his picture, since most of the universe’s matter is contained in protons. So the potential (vacuum) energy in one proton equals all of the physical energy throughout the universe. This cannot be some coincidence.
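That claim is easy to sanity-check numerically. A back-of-the-envelope of my own, assuming the vacuum energy density is the Planck density (the usual choice in these arguments) and the sphere radius is the proton Compton wavelength:

```python
import math

# SI constants (rounded CODATA values)
G    = 6.674e-11     # m^3 kg^-1 s^-2
c    = 2.998e8       # m/s
hbar = 1.055e-34     # J s
h    = 6.626e-34     # J s
m_p  = 1.673e-27     # kg

# Planck (vacuum) mass density: rho = c^5 / (hbar * G^2)
rho_planck = c**5 / (hbar * G**2)          # ~5e96 kg/m^3

# Sphere with radius equal to the proton Compton wavelength
r = h / (m_p * c)
V = (4.0 / 3.0) * math.pi * r**3

# "Bare mass" in this picture: vacuum density times that volume
M_vacuum = rho_planck * V

print(f"Vacuum mass in one Compton sphere: {M_vacuum:.2e} kg")
```

The result is around 5×10⁵² kg, and the ordinary-matter mass of the observable universe is commonly estimated at roughly 10⁵³ kg, so the two are indeed the same order of magnitude.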

I believe that Nassim’s results suggest that there is a finite amount of energy, originally stored in the vacuum. It seems that energy may be transferred from the vacuum into the physical world in discrete packets, which gives quantum mechanical phenomena. His theory suggests that we may be able to find a QFT with finite bare parameters. Once we reach a distance where these bare parameters would be exceeded, it is suggestive of symmetry breaking, or higher energy physics.

Let’s dive deeper into Up’s 6 main critiques:

- Up admits that the quantum vacuum calculation is approximate, then argues that Nassim should have gotten a closer result when comparing gravity and the strong force. He argues that the value of the energy density could be off by many orders of magnitude, which is true. He then says that Nassim should get a result closer than the 2 orders of magnitude he is off by. Are you serious? Obviously Nassim isn’t going to get it exactly right; you just explained why in your own complaint. It is clear that his skepticism is guiding him, rather than logic.
- Nassim seems to think that there is no strong force. I agree with Up here: there is a strong force, and we clearly measure its results. I think Nassim is a bit confused here. I also don’t know why he calculates the force between two protons. He should have known that this would mislead a lot of people. I think he is arguing that we have the potential to create devices with more force than we currently realize. Imagine if an EM drive were made using these forces. I think Nassim is making a good point in recognizing that gravity and the strong force must be related, so if you were to integrate out all of the strong force from your theory, then the gravitational fields would get renormalized to describe these phenomena.
- Up claims Nassim gets a result that disagrees with special relativity. Nassim does not use special relativity; he uses classical mechanics to calculate the speed. Nowhere is Nassim required to agree with the result of special relativity. He is using bare mass, not physical mass, so everything is different. It is clear that Nassim is thinking of the proton as closely packed spheres. Since this is a new concept, it would not be surprising if he made a mistake by a factor of 1.5 or so; you could just change the way you envision the spheres to match. The whole point Nassim is making is that he calculates the speed of light from this gravitational effect. He is getting a special relativity number from a classical theory, which is quite remarkable.
- Up seems to be agreeing with Nassim, yet is so frustrated that he claims that it is an obvious result. Up simply takes the length scale and divides by c, but Nassim never assumes c, it popped out of his theory, as explained above. Remarkable.
- Up skips the most conclusive result of Nassim’s work. He shows that all systems in physics roughly follow this linear trend, and the Schwarzschild proton agrees with this trend exactly.
- Up also skips talking about this. Nassim admits that his theory is a classical approximation and gets a reasonable result. He is off by exactly 2.25, which is 9/4. Notice that Up complained before about a factor of 1.5 in bullet point 3. Maybe it is no coincidence that 1.5^2 = 2.25, suggesting a small geometric error in Nassim’s work?

It is my goal to start a free energy revolution, much akin to the industrial revolution. If particles can borrow energy from the vacuum, then so can humans! Perpetual motion is possible without violating the first law of thermodynamics.

In conclusion, I think it is such a shame that the scientific community is so closed-minded that it immediately disregards any model that doesn’t align perfectly with already established models. Nassim is addressing an actual serious problem with all of QFT. In QFT, we say that the bare charge and mass of all fermions are infinite, which seems absurd. Nassim does seem to have some insight on the vacuum and what it means for a particle to exist without including the quantum fluctuations occurring in the space around it. Everything we measure about the electron is not associated with the electron itself, but rather with the shielding provided by the vacuum. In some senses, we don’t know what the electron is like at all in a laboratory; we only understand the result of its interactions with the vacuum. Nassim seems to understand what the bare proton is better than most physicists. He clearly doesn’t know as much about the interactions, QCD, etc., but we can forgive him that.

For those who don’t know, the EM drive is an RF resonant cavity thruster proposed by Roger Shawyer. Essentially, it is a closed cone-shaped hunk of metal with microwave radiation (light, or photons) bouncing around inside it. I like to picture it as a photon gas enclosed in the shape of a cone. The device is therefore powered by something similar to a microwave oven as its source of energy.

The source of energy is not the problem, but rather the source of momentum. Any theory of electromagnetism, whether classical or quantum mechanical, includes the notion of radiation pressure. Photons carry momentum and can therefore exert a pressure, the average force per unit area. The classical equation for radiation pressure involves the speed of light, which is typically considered constant.

Perhaps you have read one of my posts on the Scharnhorst effect. It states that boundary conditions, such as the shape of a cone, act as a dispersive medium. Essentially, if you are in empty space inside a cone, it is as if you are not in a vacuum. This is a very confusing point in quantum field theory, and I will attempt to describe it in layman’s terms.

Roughly speaking, in quantum electrodynamics (the QFT of electromagnetism), you state that classical mechanics is a good approximation, but not the exact solution. Whenever you calculate anything, you can find the “tree level amplitude”, which gives back the exact solution in classical electromagnetism. The power of QFT allows you to calculate “one loop amplitudes”, which are essentially quantum corrections. These quantum corrections represent scattering of the classical particles with the quantum vacuum.

To say the least, if you grew up your whole life learning classical mechanics, you will be very confused by these “virtual particles”, which represent interactions with the quantum vacuum and are only in loop amplitudes, not tree level. Basically, a photon can “briefly turn into” a virtual electron/positron pair and then turn back into a photon later. The one loop Feynman diagram for this process is shown below.

The wiggly line on the left represents a photon coming in, while the circle represents the electron/positron pair. We can see that the circle closes, and the photon is reemitted on the other side.

If we think of Feynman diagrams in position space, we instantly realize how a cone might be able to affect 1-loop calculations. The way I envision it is that on the larger side of the cone, there are more 1-loop diagrams contributing to the process, since you could fit bigger circles in that area without being cut off. However, if you are near the pinch of the cone, the vacuum diagram contributions become less and less.

This analogy is where many physicists may start to disagree with me (including the professor who taught me QFT at MIT), but this was my first analogy for why 1-loop diagrams affect the speed of light. Essentially, the photon always moves at some very fast speed, perhaps even infinite, or at least larger than the speed of light as we classically view it. It is the virtual state that slows the particles down, since they are now massive electron/positron pairs. This analogy has to be wrong in some sense, since the virtual particles are off-shell and not on-shell, but I don’t want to open that can of worms. In short, it seems conceivable that if there are more 1-loop diagrams, perhaps they slow the actual particles down.

I remember a time when I tried to explain this to my professor, and he utterly disagreed with me. I then told a friend, and he showed me the work of Scharnhorst, which has the mathematics to back up my ideas. Essentially, the conceptual picture is right: if you have two metal plates really close together, light can move slightly faster than c, since you cut out vacuum modes. I have gotten a lot of ridicule whenever I try to describe this to anyone, because people instantly say it “violates special relativity”, which is hilarious, since this result comes from QFT, which is inherently a theory of special relativity. No causality is violated.

So now I have hopefully convinced you that the speed of light varies as I go up and down the height of the cone. It is very slight, yet varying. If the speed of light varies, then the radiation pressure varies, and this could cause an imbalance in forces, causing propulsion! Not so fast… Scharnhorst himself said Casimir plates would change the speed of light by something of order 10^(-40), so how could this possibly result in something we could measure in a laboratory today? I still have this confusion, which will be elucidated by the calculation I did and will briefly describe next.

If anyone who is familiar with statistical mechanics is still confused about what the quantum vacuum is, think of it as a quantum mechanical action reservoir. There was a moment where I thought I was about to discover this on my own, but then I remembered that I had skimmed through this paper a few months earlier: http://arxiv.org/abs/physics/0605068

If anyone is still hung up on how a conservation law such as momentum conservation could be broken, you must realize that it comes from the ambiguity in your definition of momentum in the first place. The simplest resolution is to say that the quantum vacuum has momentum, and therefore we are pushing off of the quantum vacuum. This statement is overly simplistic, however, and very subtle. Let’s also note that this would not be the first conservation law broken by a QFT: anomalous behavior is one of the most interesting topics in QFT, and scale invariance is broken in QFT through renormalization.

My first attempt to describe the phenomenon was to assume that the virtual particles must go on-shell. This means that p^2 = m^2, the on-shell condition of special relativity. In QFT, virtual particles can take on any crazy value, including ones where p^2 is negative, which would seem to be an imaginary-mass particle. My adviser at UCLA has developed methods that calculate 1-loop amplitudes from purely on-shell kinematics, suggesting that you should be able to find the solution from the on-shell case. As a rough guess, it seemed reasonable to assume that the only way such a crazy EM drive result could occur is if the photon energy density is great enough to create an electron/positron pair.
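To see where this pair-creation threshold sits, here is a back-of-the-envelope sketch; the 2.45 GHz magnetron frequency is my own assumption for a microwave-oven-like source.

```python
# How many microwave photons carry the rest energy of an e+/e- pair?
h_ev = 4.136e-15   # Planck constant, eV*s
m_e_c2 = 0.511e6   # electron rest energy, eV
f = 2.45e9         # assumed magnetron frequency, Hz

photon_energy = h_ev * f               # ~1e-5 eV per microwave photon
n_photons = 2 * m_e_c2 / photon_energy # photons needed to reach 2*m_e*c^2
```

The answer comes out around 10^11 photons, consistent with the estimate for the number of interacting photons below.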

So I started my first calculation: can you get a high enough photon energy density to excite this on-shell electron/positron pair over the volume of the whole cone? I got into some deep discussions with experimentalists about this. Both of them concluded that we did not have enough power, but would in a couple of decades. I actually disagreed with these professors on a technical detail, which gave me a lower value of the power. Okay, so I found a way to argue that maybe there is a high enough energy density to create an on-shell pair, but how much force would be generated?

The next thing I did was attempt to calculate the speed of light. Assuming microwave frequencies and reasonable laboratory sizes, I realized that you would need approximately 10^11 photons all interacting at once. This is quite a difficult calculation! I have calculated 2-photon scattering, and maybe I have written code to calculate 5-gluon scattering, but never in my life have I even attempted 10^11-particle scattering. So I came up with a very crude approximation for the speed of light. Roughly speaking, the photons move at c when they are in the “classical state”, i.e. not scattering with the quantum vacuum. I looked at this as a two-state system: there is the classical state (speed c) and the 1-loop quantum state (speed 0). The quantum state has a speed of 0, since we assume the electron/positron pair briefly comes to rest in the cavity. The final speed is then each state’s speed weighted by its probability, so if the quantum state has a larger probability, the photon’s effective speed of light slows down.
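The two-state bookkeeping above can be sketched in a few lines; the loop probability plugged in here is purely illustrative.

```python
# Two-state toy model for the effective photon speed:
# the classical state moves at c, the (assumed at-rest) virtual-pair
# state moves at 0, so the average is just (1 - p) * c.
c = 2.998e8  # m/s

def effective_speed(p_quantum):
    """Probability-weighted speed: (1 - p)*c + p*0."""
    return (1.0 - p_quantum) * c

# Even an (illustrative) loop probability of 1e-10 changes the
# speed of light by only a few centimeters per second.
dc = c - effective_speed(1e-10)
```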

Okay, so I have this crude approximation for the effective speed of the photon, but how do I calculate the probabilities? Well… I don’t… This is harder than anything I have ever imagined. I can say this, however: since there are 10^11 photons interacting, there will be 10^11 vertices on the 1-loop diagram. Each vertex multiplies in a factor of 1/137, which is known as the fine structure constant. So the probability of the quantum state is equal to something times (1/137)^(10^11). This is a very, very small number. To top it off, the radiation pressure equation has a 1/c^3 dependence, so the difference in pressure would be even smaller!
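Since (1/137)^(10^11) underflows any floating-point type, it is easiest to get a feel for it through its base-10 logarithm:

```python
import math

# Each of the ~1e11 vertices contributes a factor of 1/137, so
# log10 of the probability is n_vertices * log10(1/137).
n_vertices = 1e11
log10_prob = n_vertices * math.log10(1.0 / 137.0)
# The probability is 10 to the power of roughly -2e11: a number with
# two hundred billion zeros after the decimal point.
```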

So I concluded YES! There does exist an EM drive effect! However, the pressure gradient generated is so small that it would be impossible to ever measure it in a lab.

I also want to stress that at multiple points I made many assumptions TRYING to make the EM drive make sense. I assumed we had enough energy to put the particles on-shell. I even assumed that the speed of the electrons is zero, which is unlikely. I tried my hardest to get as large a result as possible, and it still came out tiny!

There are only two ways in which I could reconcile such an experimental result with my theoretical understanding:

1) Off-shell states happen frequently and affect things

2) The group velocity of light is somehow altered, but not by the Scharnhorst effect.

I encourage any questions, comments, concerns, refutations, corrections, etc! I want to get to the bottom of understanding if and how such a device is possible.

Quantropy is defined mathematically here: http://arxiv.org/pdf/1311.0813v2.pdf

Physically, it is required in a quantum mechanical path integral for complex systems such as computers or humans with memory.

Assume electric charge warps spacetime, but not very much. If this is true, then F = kqq/r^2 breaks down classically because we have changed r. The typical calculational procedure says q is renormalized, which is an error in our philosophical framework. This error gets propagated from microsystems to macrosystems. Information is destroyed, due to the quantum mechanical nature of “superrotations”, the generators of quantum rotation. Once this happens, we have introduced bad infinities into our theory. If the fabric of spacetime is removed, then no ambiguities or infinities are ever brought into the system. If you do not remove spacetime, it almost seems that MOND ideas may be sensible or that energy conservation breaks down. BMS symmetry preserves this classical notion of energy and momentum and proves it is well defined. However, it neglects the dynamical nature of rotations, which are inherently quantum mechanically driven fluctuations.

But wait! There should be a conservation of infinities. Where did this divergence go? We choose a path integral which has integrated over all of spacetime, leaving only the quantum mechanical Scri of Penrose. Now that we have integrated over all of spacetime, we must integrate over an infinite number of particle collisions and include that in our path integral. With entanglement, quaternions, Chris Doran’s geometric algebra, and Nima’s passion, we can reformulate QFT sensibly for gravity, since we can have computers simulate gravitational scattering of a few particles. As long as we take the analytic structure of these few particles, rather than an infinite number of particles, we can transfer information without loss. I conjecture that if you integrate over 4 dimensions of spacetime, then you can only be right if your theory uses gravitational scattering of 4 or fewer gravitons. If you are 100% certain that more than 4 gravitons would never interact, then you can pretend spacetime is always essentially flat and renormalize everything using dimensional regularization. The reality is that physicists are entities which exist in an infinite dimensional realm.

This motivates the study of BMS symmetry and extending it rigorously quantum mechanically. The difficulty is constructing the method for performing this path integral over spacetime. It is an integral that destroys all information about distances between objects and only leaves the information of conformal infinity.

I claim that non-standard analysis could be taught to a computer, and a double copy of bits could be used to simulate a quantum computer. This could then be tested against experimental evidence, and we could simulate simple quantum gravity systems. In order to remove spacetime, we must throw away our current notion of a Lorentz transformation. I define a new type of transformation, yielding a fundamentally quantum mechanical and special relativistic theory.

Spacetime is ill-defined at short distances, so it makes most sense to integrate out the fabric of spacetime to actually know what we are talking about. It seems that the methods of Doran may help with rigorously defining this new method for solving the path integral. Think of it as integrating out spacetime, but still having momentum space around because life is complex. Now you can integrate over momentum space and not get infinities, because there is no longer any reference to the fabric of spacetime, no distance is ever referred to.

It seems that in order for the BMS symmetry to be real, we might need to assume that humans exist in an infinite dimensional realm, which is difficult to define precisely in mathematics. Energy can be transferred between people through emotions. This seemingly illogical statement is actually logical, yet it does not hold in a 4-dimensional universe; it only holds in infinite dimensional universes. Essentially, humans have the ability to control which dimension they are in. This changes our perception of reality in ways we cannot fully comprehend… yet!

The Seebeck effect states that a temperature difference across a conductor produces an electric potential difference, ΔV = -S ΔT, where V is the electric potential, T is the temperature, and S is the Seebeck coefficient. S depends on the properties of the material and on T. Metals have modest Seebeck coefficients, and some semiconductors have larger ones. Newly discovered materials with higher coefficients would improve the efficiency of energy extraction from natural temperature gradients, such as the engine of a car or a refrigeration system. Fully understanding the Seebeck effect in the low-gradient limit is essential for optimizing such designs.
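As a quick sanity check on the linear Seebeck relation, here is a minimal sketch; the coefficient and temperature difference are illustrative values, not measurements from any paper discussed here.

```python
# Linear Seebeck relation: delta_V = -S * delta_T.
def seebeck_voltage(S_v_per_k, dT_k):
    """Open-circuit voltage (V) across a material with Seebeck coefficient S."""
    return -S_v_per_k * dT_k

# Illustrative values: a metal-like S of ~5 uV/K across a 10 K difference
# produces a voltage of tens of microvolts.
v = seebeck_voltage(5e-6, 10.0)
```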

The CalTech paper (http://scitation.aip.org/content/aip/journal/rsi/82/6/10.1063/1.3601358) finds a rather bizarre non-linearity in accurate measurements of the Seebeck effect. The non-linearity is simply an affine transformation, a shift by a constant V_0. This constant also depends on the material and the temperature, which would change the Seebeck relation to: ΔV = -S ΔT + V_0.

This seems rather unintuitive! Why would a voltage spontaneously appear if both sides were at the same temperature? Indeed, if one puts a heat reservoir of the same temperature on each end of a metal bar, it creates a small voltage of about 20 microvolts. This is not random fluctuation from side to side like Johnson or shot noise; it is truly a DC measurement. Clearly, to describe this effect accurately, more physics is needed to understand what determines this unknown constant V_0.

What other possible occurrences in nature could cause this voltage? Perhaps gravity plays a role. It is weak, but this experiment probes the smallest electromagnetic measurements currently possible. This is essentially a low-energy, or IR, “divergence”. Let’s think about how gravity could possibly affect temperature.

The first thing that may come to mind is the Unruh effect, which states that an accelerating observer sees black-body radiation from the vacuum. By the equivalence principle, the force of gravity on Earth corresponds to an accelerated frame with respect to the Unruh effect. So the closer one gets to the surface of the Earth, the more radiation one would see. In other words, everything measured around it would seem hotter than it actually is. To deny a temperature gradient caused by gravity would be to disagree with fundamental principles of relativity.
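To put a number on this, here is a sketch of the Unruh temperature for an acceleration of 1 g:

```python
import math

# Unruh temperature for an acceleration a: T = hbar * a / (2 * pi * c * k_B).
hbar = 1.055e-34  # reduced Planck constant, J*s
c = 2.998e8       # speed of light, m/s
k_B = 1.381e-23   # Boltzmann constant, J/K
g = 9.81          # Earth's surface gravity, m/s^2

T_unruh = hbar * g / (2 * math.pi * c * k_B)
# The result is of order 1e-20 kelvin: utterly negligible.
```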

However, this Unruh effect is so small that it would be many orders of magnitude smaller than a comparable Seebeck effect, undetectable by modern voltage measurement resolution. So maybe gravity is not the answer for the mysterious V_0. But wait: the Seebeck effect is a classical effect, so shouldn’t there be a classical thermogravity effect just as there is a classical thermoelectric effect?

This effect does not seem to be accepted by the majority of physicists. Fairly recent papers by Coombes and Laue in 1984, as well as Roman in 1996, argue why this thermogravity effect should not exist: a microcanonical ensemble suggests that the stationary state is at the same temperature throughout. After all, this is the definition of equilibrium. While this is true for the simplest of systems, I am skeptical that it can be proven to hold for any general system.

Measurements of the atmosphere show a temperature gradient which could be due to gravity. While the system is not stationary and is highly non-adiabatic, meteorologists and weather analysts approximate the atmosphere as an adiabatic system and use a classical analogue of the thermoelectric effect, essentially with mass in place of electric charge in the coefficients. Experiments by Roderich Graeff have measured this effect, and he uses the classical adiabatic equation to match his findings fairly accurately.
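The adiabatic approximation referenced here gives the familiar dry adiabatic lapse rate of the atmosphere, which can be sketched as:

```python
# Dry adiabatic lapse rate: dT/dz = -g / c_p, i.e. temperature drops
# by roughly 10 K for every kilometer of altitude gained.
g = 9.81      # m/s^2
c_p = 1005.0  # specific heat of dry air at constant pressure, J/(kg*K)

lapse_rate_k_per_km = (g / c_p) * 1000.0
```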

However, the CalTech experiment shows that a voltage gradient is expected to be measured with thermocouples (which exploit the Seebeck effect) even when there is no temperature gradient. Perhaps there is no temperature gradient and we simply have not understood this anomalous V_0. Who is to say whether we have measured a gravitational effect or are noticing this bizarre non-linearity?

A simple experiment I have thought of will tell us exactly how much of V_0 is due to gravity and how much is due to unknown physics. Imagine a metal bar held at zero temperature gradient between its ends, just as was done at CalTech. If V_0 is measured to be isotropic, then it cannot be due to gravity. If the measured voltage is zero when the bar is horizontal and a maximum when it is vertical, that would suggest gravity is a valid explanation. Furthermore, if the bar is rotated by 180 degrees, one would expect the voltage gradient to reverse sign if this is a gravitational effect. The final possibility is that gravity is a relevant factor but not the whole story; this would occur if V_0 has some minimum non-zero absolute value in the horizontal position and a maximum absolute value in the vertical position.
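A toy model of what the tilt test might show, assuming a purely gravitational contribution proportional to the bar’s vertical projection; the 20-microvolt amplitude and the cosine dependence are my own assumptions, not predictions from any derivation.

```python
import math

def offset_model(theta_deg, v_grav, v_iso=0.0):
    """Hypothetical offset (V) vs. tilt angle from vertical.

    v_grav: assumed gravitational amplitude; v_iso: any isotropic part."""
    return v_iso + v_grav * math.cos(math.radians(theta_deg))

v_up = offset_model(0.0, 20e-6)      # bar vertical
v_flat = offset_model(90.0, 20e-6)   # bar horizontal: offset vanishes
v_down = offset_model(180.0, 20e-6)  # bar inverted: offset flips sign
```

A nonzero `v_iso` would correspond to the mixed case where gravity is relevant but not the whole story.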

I plan to ask the researchers at CalTech whether they have tried rotating their experiment by 90 degrees and noticed a difference. If V_0 is isotropic, then Graeff has not measured a gravitational effect, but rather an anomaly of the Seebeck equation in the low differential energy limit. If gravity does play a role, a quantum statistical derivation of the effect is the next logical step. I will start this pursuit with the following assignments.

Ideally, we would like to renormalize our equation in such a way as to force it to be linear; fundamentally, the true Seebeck effect should be linear. Instead of working with V_0 directly, it will be convenient to define an arbitrary physical field φ such that V_0 = φ. Note that in general, φ depends on the material properties and the temperature. Next, a Legendre transform on the potential is made: V' = V - φ. Now we have recovered a truly linear Seebeck effect, dominated by electromagnetism for ordinary laboratory voltages: ΔV' = -S ΔT.

But what is φ physically? If it is purely due to gravity, it should be something like φ = γΦ_g, where Φ_g is the gravitational potential and γ is some material- and temperature-dependent coupling constant relating the strength of the electrogravity and thermogravity effects. While γ does have units, it should have a value much less than 1 for magnitudes typical of lab experiments.
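An order-of-magnitude estimate of the coupling constant (call it gamma) under these assumptions; the 1 m bar length is my own assumption, and the 20 microvolt offset is the value quoted earlier.

```python
# If the ~20 uV offset were entirely gravitational, phi = gamma * Phi_g
# implies gamma ~ V0 / (g * h), since the gravitational potential
# difference across a bar of height h is g * h.
g = 9.81    # m/s^2
h = 1.0     # assumed bar length, m
V0 = 20e-6  # observed offset, V

gamma = V0 / (g * h)  # units: V per (J/kg)
```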

Such a gravitationally induced gradient was essentially written down by Loschmidt in the 1860s, when he argued that gravity would permit a perpetuum mobile of the second kind. Maxwell and Boltzmann shut Loschmidt down by saying this process would violate the second law. It is my personal opinion that neither side of this argument was exactly correct. It seems that there is new fundamental physics lying beyond us. Perhaps this discrepancy will lead to new types of efficient engines created at the microscopic level. Once this process is understood, technology could be advanced to magnify such an effect.

Loschmidt’s argument led to the classical equation which meteorologists use, incorrectly, for describing the atmosphere, which is non-adiabatic. The thermoelectric effect has a more rigorous quantum statistical mechanics derivation of the Seebeck coefficient. I am currently making progress on an analogous quantum statistical derivation of γ, which would provide a theory for a class of materials to be experimentally verified.

http://en.wikipedia.org/wiki/Thermoelectric_effect

After discussions with Prof. Fronsdal, it seems that there is experimental evidence suggesting that the typical Seebeck effect is non-linear. The typical equation is dV = -S dT, where dT is the temperature difference applied and dV is the potential difference produced. S is the Seebeck coefficient, which depends on the metal. The non-linearity presents itself as a shift of the equation by a constant on the order of a microvolt.

If gravity is included in this equation, it could fix this non-linearity seen in experiment. Furthermore, the weakness of gravity motivates why it would only shift the electric potential by a microvolt. So right now, I have written down an equation for the current density j which is similar to Ohm’s law, but also includes a coupling to the temperature and the gravitational potential Φ_g.

Here j is the current density, σ is a constant related to the Seebeck coefficient S, and γ is a new constant which I have introduced. There are quantum mechanical and statistical mechanical derivations of S, so I figure learning these may give insight into how to derive γ.
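A plausible form of such a generalized Ohm’s law can be sketched as follows; the precise structure and the names (sigma for the conductivity, gamma for the new constant) are my own assumptions, not the exact equation from this post.

```python
# Sketch of a generalized 1-D transport law with electric, thermal,
# and gravitational driving terms:
#   j = -sigma * (grad V + S * grad T + gamma * grad Phi_g)
def current_density(sigma, grad_v, S, grad_t, gamma, grad_phi):
    """1-D current density with electric, thermal, and gravitational drives."""
    return -sigma * (grad_v + S * grad_t + gamma * grad_phi)

# In the open-circuit (j = 0) state, the voltage gradient is fixed by
# the thermal and gravitational terms: grad V = -S*grad T - gamma*grad Phi_g.
j_open = current_density(1.0, -(0.01 * 5.0 + 1e-6 * 9.81), 0.01, 5.0, 1e-6, 9.81)
```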

Some questions I have: Does γ depend on the material properties, perhaps the mass density? Does this effect treat solids and gases differently? A solid is made up of microscopic massive constituents which cannot move, while a gas’s constituents can. Insulators do not have free electrons and therefore do not show the Seebeck effect. Does this mean that solids will not show the gravitational effect, but gases will?

Tomorrow, I will meet with Prof. Fronsdal to see what he has to say about these thoughts and questions. I may learn that other people have already been developing this theory; I am not sure how much others have thought about this idea, and I couldn’t find any similar explanation on the internet. If it is true, then Roderich Graeff’s experiments could confirm the validity of the generalized equation.

Furthermore, it seems that this electric potential (and perhaps the gravitational potential) must contribute entropy to the system. This seems to be the only way to keep the second law of thermodynamics, with entropy always staying equal or increasing as time progresses. Perhaps the true total entropy cannot decrease, but the way it is currently calculated can. I retract all statements I made about total entropy spontaneously decreasing. This seems to be the only way to resolve the controversy of D’Abramo’s thermocharged capacitor.
