
Project Mockingbird

First, I would like to apologize for how long this entry took to come out; I have been working to keep the facts accurate.  If you need a refresher on this series, or just want to read from the start, begin here.

There is a conspiracy theory that the news is currently being controlled by the powers that be.  In these polarizing times, with each major news organization pushing its own spin (MSNBC on the left, FOX News on the right, CNN somewhere on the TMZ side), this seems simultaneously intuitive (because of course they are) and wrong (because there is no single spin perpetuated by popular news).  The point of this kind of media control is a form of mind control of the masses, though not of the same variety I've covered before on this blog.

In 1973, there was a congressional hearing (a year that seems to be coming up a lot lately) which revealed that the CIA had paid off a variety of reporters in order to release government-sanctioned news stories to the public.  Out of fear of World War III, and under the watchful eye of Frank Wisner, director of the Office of Special Projects, roughly 3,000 members of the CIA worked to put about 400 members of the media on the payroll of the federal government, feeding those journalists stories favorable to the government's ideology.

The goal of Project Mockingbird was to use the trusted media to get the American public to trust the government more in the event of World War III, especially since we were just coming off of World War II.  Given that plenty of people had already lived through two world wars, the thought of a third one so soon was not a popular idea in the late '40s and the '50s.  So what better way to secure support than to have the media telling you that you can trust the government?  Ironically, this could truly have been called "Fake News" and "Alternative Facts".

In the roughly 28 years from the project's start until its exposure, there were about 3,400 people involved between the CIA side and the media side, which puts the minimum per-person-per-year risk of exposure at 7.3726E-06 (0.0000073726).  For those of you keeping score at home, this brings the running average for the series to 1.3704E-04 and the standard deviation to 3.2084E-04.
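If you want to keep your own running tally as the series goes on, the bookkeeping is just a mean and a sample standard deviation over the per-conspiracy risks.  Here is a minimal sketch in Python; the list of earlier risks below is a hypothetical placeholder, since the actual values from the previous entries aren't reproduced in this post:

```python
from statistics import mean, stdev

# Hypothetical per-person-per-year exposure risks from earlier entries
# in this series (placeholders only, not the actual values).
previous_risks = [2.1e-4, 5.8e-5, 3.9e-4]

# Value from this entry: ~3,400 people involved over ~28 years.
mockingbird_risk = 7.3726e-06

risks = previous_risks + [mockingbird_risk]

# Running tally across the series so far.
print(f"average:            {mean(risks):.4e}")
print(f"standard deviation: {stdev(risks):.4e}")
```

Swap in the real numbers from the earlier entries and the printed average and standard deviation are the "score" I refer to above.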

So until next time, take that as you will.
K. "Alan" Eister Î”αβ

Relevant Entries:
First
