MischaFisher.com | Quantitative Economics

<h1>Distributional Parameters</h1>
<p><em>2017-03-01 · Mischa Fisher</em></p>
<p>Part of my thesis involved modeling survival times against parametric distributions, such as the Weibull, log-logistic, and exponential distributions.</p>
<p>One of the fun aspects of distribution theory is seeing how different parameter specifications can make some distributions special cases of others. For today's quick chart, here's a lead-in to the subject: a look at how a few commonly used survival analysis distributions resemble one another (with the parameter specifications highlighted in the R code below).</p>
<div class="highlight"><pre><span class="nx">library</span><span class="p">(</span><span class="nx">ggplot2</span><span class="p">)</span>
<span class="nx">library</span><span class="p">(</span><span class="nx">ggthemes</span><span class="p">)</span>  <span class="c1"># for theme_economist()</span>
<span class="nx">x_lower</span> <span class="o"><-</span> <span class="mi">0</span>
<span class="nx">x_upper</span> <span class="o"><-</span> <span class="mi">10</span>
<span class="nx">max_height2</span> <span class="o"><-</span> <span class="nx">max</span><span class="p">(</span><span class="nx">dexp</span><span class="p">(</span><span class="nx">x_lower</span><span class="o">:</span><span class="nx">x_upper</span><span class="p">,</span> <span class="nx">rate</span> <span class="o">=</span> <span class="mi">2</span><span class="p">,</span> <span class="nx">log</span> <span class="o">=</span> <span class="nx">FALSE</span><span class="p">),</span>
<span class="nx">dweibull</span><span class="p">(</span><span class="nx">x_lower</span><span class="o">:</span><span class="nx">x_upper</span><span class="p">,</span> <span class="nx">shape</span> <span class="o">=</span> <span class="mi">2</span><span class="p">,</span> <span class="nx">log</span> <span class="o">=</span> <span class="nx">FALSE</span><span class="p">),</span>
<span class="nx">dlogis</span><span class="p">(</span><span class="nx">x_lower</span><span class="o">:</span><span class="nx">x_upper</span><span class="p">,</span> <span class="nx">scale</span> <span class="o">=</span> <span class="mi">2</span><span class="p">,</span> <span class="nx">log</span> <span class="o">=</span> <span class="nx">FALSE</span><span class="p">))</span>
<span class="nx">ggplot</span><span class="p">(</span><span class="nx">data</span><span class="p">.</span><span class="nx">frame</span><span class="p">(</span><span class="nx">x</span> <span class="o">=</span> <span class="nx">c</span><span class="p">(</span><span class="nx">x_lower</span><span class="p">,</span> <span class="nx">x_upper</span><span class="p">)),</span> <span class="nx">aes</span><span class="p">(</span><span class="nx">x</span> <span class="o">=</span> <span class="nx">x</span><span class="p">))</span> <span class="o">+</span> <span class="nx">xlim</span><span class="p">(</span><span class="nx">x_lower</span><span class="p">,</span> <span class="nx">x_upper</span><span class="p">)</span> <span class="o">+</span>
<span class="nx">ylim</span><span class="p">(</span><span class="mi">0</span><span class="p">,</span> <span class="nx">max_height2</span><span class="p">)</span> <span class="o">+</span>
<span class="nx">stat_function</span><span class="p">(</span><span class="nx">fun</span> <span class="o">=</span> <span class="nx">dexp</span><span class="p">,</span> <span class="nx">args</span> <span class="o">=</span> <span class="nx">list</span><span class="p">(</span><span class="nx">rate</span> <span class="o">=</span> <span class="mi">2</span><span class="p">),</span> <span class="nx">aes</span><span class="p">(</span><span class="nx">colour</span> <span class="o">=</span> <span class="s2">"Exponential"</span><span class="p">))</span> <span class="o">+</span>
<span class="nx">stat_function</span><span class="p">(</span><span class="nx">fun</span> <span class="o">=</span> <span class="nx">dweibull</span><span class="p">,</span> <span class="nx">args</span> <span class="o">=</span> <span class="nx">list</span><span class="p">(</span><span class="nx">shape</span> <span class="o">=</span> <span class="mi">2</span><span class="p">),</span> <span class="nx">aes</span><span class="p">(</span><span class="nx">colour</span> <span class="o">=</span> <span class="s2">"Weibull"</span><span class="p">))</span> <span class="o">+</span>
<span class="nx">stat_function</span><span class="p">(</span><span class="nx">fun</span> <span class="o">=</span> <span class="nx">dlogis</span><span class="p">,</span> <span class="nx">args</span> <span class="o">=</span> <span class="nx">list</span><span class="p">(</span><span class="nx">scale</span> <span class="o">=</span> <span class="mi">2</span><span class="p">),</span> <span class="nx">aes</span><span class="p">(</span><span class="nx">colour</span> <span class="o">=</span> <span class="s2">"Logistic"</span><span class="p">))</span> <span class="o">+</span>
<span class="nx">scale_color_manual</span><span class="p">(</span><span class="s2">"Distribution"</span><span class="p">,</span> <span class="nx">values</span> <span class="o">=</span> <span class="nx">c</span><span class="p">(</span><span class="s2">"blue"</span><span class="p">,</span> <span class="s2">"green"</span><span class="p">,</span> <span class="s2">"red"</span><span class="p">))</span> <span class="o">+</span>
<span class="nx">labs</span><span class="p">(</span><span class="nx">x</span> <span class="o">=</span> <span class="s2">"\n x"</span><span class="p">,</span> <span class="nx">y</span> <span class="o">=</span> <span class="s2">"f(x) \n"</span><span class="p">,</span>
<span class="nx">title</span> <span class="o">=</span> <span class="s2">"Common Survival Analysis Distribution Density Plots \n"</span><span class="p">)</span> <span class="o">+</span>
<span class="nx">theme_economist</span><span class="p">()</span> <span class="o">+</span>
<span class="nx">theme</span><span class="p">(</span><span class="nx">plot</span><span class="p">.</span><span class="nx">title</span> <span class="o">=</span> <span class="nx">element_text</span><span class="p">(</span><span class="nx">hjust</span> <span class="o">=</span> <span class="mf">0.5</span><span class="p">),</span>
<span class="nx">axis</span><span class="p">.</span><span class="nx">title</span><span class="p">.</span><span class="nx">x</span> <span class="o">=</span> <span class="nx">element_text</span><span class="p">(</span><span class="nx">face</span><span class="o">=</span><span class="s2">"bold"</span><span class="p">,</span> <span class="nx">colour</span><span class="o">=</span><span class="s2">"blue"</span><span class="p">,</span> <span class="nx">size</span> <span class="o">=</span> <span class="mi">12</span><span class="p">),</span>
<span class="nx">axis</span><span class="p">.</span><span class="nx">title</span><span class="p">.</span><span class="nx">y</span> <span class="o">=</span> <span class="nx">element_text</span><span class="p">(</span><span class="nx">face</span><span class="o">=</span><span class="s2">"bold"</span><span class="p">,</span> <span class="nx">colour</span><span class="o">=</span><span class="s2">"blue"</span><span class="p">,</span> <span class="nx">size</span> <span class="o">=</span> <span class="mi">12</span><span class="p">),</span>
<span class="nx">legend</span><span class="p">.</span><span class="nx">title</span> <span class="o">=</span> <span class="nx">element_text</span><span class="p">(</span><span class="nx">face</span><span class="o">=</span><span class="s2">"bold"</span><span class="p">,</span> <span class="nx">size</span> <span class="o">=</span> <span class="mi">10</span><span class="p">),</span>
<span class="nx">legend</span><span class="p">.</span><span class="nx">position</span> <span class="o">=</span> <span class="s2">"top"</span><span class="p">)</span>
</pre></div>
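<p>The "special cases" point above can also be checked numerically. Here's a quick sketch in Python with SciPy (rather than the R used elsewhere on this site): a Weibull with shape parameter 1 collapses to the exponential distribution.</p>

```python
import numpy as np
from scipy.stats import expon, weibull_min

# A Weibull with shape = 1 is exactly an Exponential with rate = 1:
# both have density f(x) = exp(-x) for x >= 0.
x = np.linspace(0.0, 10.0, 101)
assert np.allclose(weibull_min.pdf(x, c=1), expon.pdf(x))  # c is SciPy's shape parameter
```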
<p><img class="img-responsive img-center" src="/images/posts/2017-03-01-Distro.png" width="80%"/></p>

<h1>Distributions and Their Parameters</h1>
<p><em>2016-03-01 · Mischa Fisher</em></p>
<p>Distribution theory </p>
<h1>State Budgets and Populations (a.k.a. Why Illinois is in the shape it's in)</h1>
<p><em>2016-02-03 · Mischa Fisher</em></p>
<p>Given Illinois' current budget impasse, I thought it would be interesting to do a five-minute analysis of how the size of each state's budget varies in proportion to its population. This is obviously a very shallow examination, and one could spend weeks digging through budget numbers, federal transfers, rural-urban splits, poverty and education levels, industry compositions, unfunded pension liabilities, worker's compensation costs, and so on. Still, summary statistics exist for a reason, and five-minute analyses can be useful exercises.</p>
<p>Pulling data from Wikipedia's maintained list of U.S. state budgets <a href="https://en.wikipedia.org/wiki/List_of_U.S._state_budgets">(here)</a> and from the U.S. Census Bureau's estimate of 2015 state populations <a href="https://www.census.gov/popest/data/datasets.html">(here)</a> produces the charts below.</p>
<h2>All States</h2>
<p>First, a look at all U.S. States:</p>
<p><img class="img-responsive img-center" src="/images/posts/2016-02-Budgets_01.png" width="90%"/></p>
<p>I knew Illinois' budget has historically been unsustainable, but I was surprised at just how much of an outlier the state is. Mercatus now rates Illinois dead last in the country in fiscal solvency <a href="http://mercatus.org/statefiscalrankings">(link here)</a>, and with this chart one can see why!</p>
<h2>Just the Large States</h2>
<p>Comparing large states to large states, here are all the states with populations above 6 million people.</p>
<p><img class="img-responsive img-center" src="/images/posts/2016-02-Budgets_02.png" width="90%"/></p>
<h2>Just the Small States</h2>
<p>And in the same spirit, the small states:</p>
<p><img class="img-responsive img-center" src="/images/posts/2016-02-Budgets_03.png" width="90%"/></p>
<p>Here it's worth noting that Illinois does not have the largest per capita budget; that honor goes to Alaska. Illinois is simply the largest deviator from the overall trend line in absolute dollar terms. That said, since state budgets include federal transfer dollars for federal programs (infrastructure, heating assistance, and so on), it's not hard to see why Alaska, which has very few people but a lot of federally supported infrastructure, has the highest per capita budget.</p>
<h2>UPDATE:</h2>
<p>What was intended as a five-minute "hey, that's interesting!" analysis ended up exploding on the internet. With over 100K page views in 12 hours, the response was certainly unexpected. On that note, a few people on Reddit mentioned they'd be interested in seeing the log of the data. So here it is:</p>
<p><img class="img-responsive img-center" src="/images/posts/2016-02-Budgets_04.png" width="90%"/></p>
<p>More importantly, it's worth remembering that Wikipedia data is not always particularly accurate, nor is it necessarily an apples-to-apples comparison. Some of the data listed on the page covers single-year periods; other entries cover multi-year periods. The script I used to plot takes those things into consideration, but it may have errors given the inconsistency in Wikipedia's data. Time allowing, I'll source better data and replot at some point in the future.</p>
<h2>UPDATE 2:</h2>
<p><a href="http://www.nasbo.org/sites/default/files/State%20Expenditure%20Report%20%28Fiscal%202013-2015%29S.pdf">NASBO</a> has a dataset on state spending. I took some time to manually transcribe the state and spending columns as vectors in R (so there may be transcription errors), and the log of the data produces a result <strong>very</strong> different from Wikipedia's:</p>
<p><img class="img-responsive img-center" src="/images/posts/2016-02-Budgets_05.png" width="90%"/></p>
<p>So the story of Illinois' terrible fiscal condition could well be more complicated than can be captured in a single graph. As with most things in life, the issue is complex and nuanced.</p>
<p>I will add, though, that the NASBO graph doesn't capture unfunded pension liabilities, so it too can't be taken as anything more than face value.</p>

<h1>Renting vs. Owning in Chicago</h1>
<p><em>2016-01-23 · Mischa Fisher</em></p>
<p>Moving to Chicago this past weekend prompted the age-old question: should one rent or should one buy?</p>
<p>Independent of the qualitative and lifestyle differences between the two choices, I was curious how, strictly speaking, the finances of the two options worked out. I worked the Case-Shiller price index for condos in the Chicago metro and the historical rate of return on real estate prices into a short R function, which produced the results below.</p>
<p>Worth noting, the calculation <strong>included</strong>:</p>
<ul>
<li>The opportunity cost of capital</li>
<li>Historical returns on real estate prices</li>
<li>For Owning: Mortgage, property taxes, closing costs, Home Owners Association fees, cost of ownership</li>
<li>For Renting: Rent, Utilities</li>
</ul>
<p>However, it <strong>did not include</strong>:</p>
<ul>
<li>The mortgage interest deduction</li>
<li>Non-linear mortgage amortization (this calculation is linear, rather than skewed toward the later years as it would be on a traditional amortization schedule)</li>
<li>Down payments other than 20%, such as FHA loans that allow as little as 3.5%</li>
</ul>
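<p>The bookkeeping behind the chart can be sketched as a simple cumulative-cost comparison. The numbers below are illustrative placeholders (they are not the inputs used for the actual chart), and the sketch is in Python rather than the R function mentioned above:</p>

```python
import math

# All inputs are hypothetical, for illustration only.
price = 250_000          # condo price
down = 0.20              # 20% down payment
closing = price * 0.04   # one-time closing costs at purchase
own_annual = (
    price * (1 - down) * 0.04   # mortgage interest
    + price * 0.035             # property taxes, HOA, upkeep
    + price * down * 0.05       # opportunity cost of the down payment
)
rent_annual = 1_800 * 12        # rent plus utilities

# Years until the saved annual expenses pay off the closing-cost hit:
years = math.ceil(closing / (rent_annual - own_annual))
print(years)  # about 5 years under these made-up assumptions
```

<p>The break-even point is sensitive to every one of these inputs, which is why the full calculation also folds in amortization and the historical return on real estate prices.</p>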
<p>The Result:</p>
<p><img class="img-responsive img-center" src="/images/posts/2016-01-Housing_01.png" width="80%"/></p>
<p>Given these basic assumptions on relative costs, the result seems to confirm the common folk wisdom that it takes about 4-5 years for the initial hit of the closing costs to be paid off by saved expenses and amortization of the mortgage loan.</p>

<h1>How Much Does Rural Living Predict Broadband Speed?</h1>
<p><em>2015-12-08 · Mischa Fisher</em></p>
<p>I was recently brushing up on the current status of the telecommunications industry in the United States, and I became curious about how much a state's rural population predicts its overall average internet connection speed.</p>
<p>Pulling data on average speeds from <a href="http://www.broadviewnet.com/blog/2014/08/internet-speeds-by-state-map/">(here)</a>, which sources Akamai's <em>State of the Internet</em> report, and data on the rural/urban population by state from Iowa State University <a href="http://www.icip.iastate.edu/tables/population/urban-pct-states">(here)</a>, reveals the plot below:</p>
<p><img class="img-responsive img-center" src="/images/posts/2015-12-Broadband_01.png" width="80%"/></p>
<p>While noticeable, the effect here is pretty minor overall. It would take a shift of about 20% of a state's population into an urban environment to raise average speeds by a single Mbps.</p>
<h2>How About Globally?</h2>
<p>Curious about how well this predicts things generally, I did the same thing looking at global data. Wikipedia has a concise list of countries by internet connection speeds <a href="https://en.wikipedia.org/wiki/List_of_countries_by_Internet_connection_speeds">(here)</a>, and the World Bank maintains a time series list of urban/rural population <a href="http://data.worldbank.org/indicator/SP.URB.TOTL.IN.ZS">(here)</a>. </p>
<p>Pulling those two datasets together reveals the following:</p>
<p><img class="img-responsive img-center" src="/images/posts/2015-12-Broadband_02.png" width="80%"/></p>
<p>Visually this looks similar, although the regression slope is a little steeper: it would take a shift of about 10% of an average country's population to raise speeds by a Mbps.</p>
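<p>That "X% per Mbps" reading is just the reciprocal of the fitted regression slope. A sketch in Python with made-up numbers (not the Akamai or World Bank data):</p>

```python
import numpy as np

# Hypothetical (urban share %, average Mbps) pairs, for illustration only.
urban = np.array([40.0, 55.0, 60.0, 70.0, 75.0, 85.0, 90.0])
mbps = np.array([8.0, 9.2, 9.0, 10.1, 10.4, 11.3, 11.8])

slope, intercept = np.polyfit(urban, mbps, 1)   # least-squares line
points_per_mbps = 1 / slope                     # urbanization shift needed per +1 Mbps
print(round(points_per_mbps, 1))
```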
<h2>Conclusion</h2>
<p>At the end of the day, <em>this exercise is likely too simple to shine much light on the phenomenon.</em> Economies of scale in population density would, all else equal, suggest that as density goes up, so does speed. But the 'average' speed by state hides the interesting variation at the county and city level, which I imagine would more clearly show swings based not just on population density but also on regulatory factors such as the ease of installation and market entry, access to public conduit and utility maps, and so on. Looking at the data more locally would probably be more meaningful than examining aggregate state or country data, because the 'average speed' metric likely smooths over regional variation and so underestimates the effect.</p>

<h1>A Brief Exercise Illustrating the Central Limit Theorem</h1>
<p><em>2015-11-26 · Mischa Fisher</em></p>
<p>Succinctly, the Central Limit Theorem can be expressed as:</p>
<blockquote>
<p><em>In probability theory, the central limit theorem (CLT) states that, given certain conditions, the arithmetic mean of a sufficiently large number of iterates of independent random variables, each with a well-defined expected value and well-defined variance, will be approximately normally distributed, regardless of the underlying distribution.</em></p>
</blockquote>
<p>If you're an econometrics student, your first exposure to the CLT is likely in discussing the OLS estimator in the context of testing hypotheses about the true parameters in order to form confidence intervals: when relying on asymptotics, normality of the error distribution is not required, because as long as the usual five Gauss-Markov assumptions are satisfied, the distribution of the OLS estimator converges to a normal distribution as <em>n</em> goes to infinity.</p>
<p>The CLT is pretty neat but is given short shrift in the context of econometrics, so here's a brief experiment one can perform in R to illustrate what happens as the theorem comes into effect.</p>
<p>Starting with the Weibull distribution:</p>
<div class="highlight"><pre>plot(sort(rweibull(10000, shape=1)), main="The Weibull Distribution")
</pre></div>
<p><img class="img-responsive img-center" src="/images/posts/2015-11-CLT_Weibull.jpg" width="80%"/></p>
<p>We then take the means of repeated samples from the distribution, with an increasing number of replications, and draw a histogram of the sample means each time.</p>
<div class="highlight"><pre>hist(colMeans(replicate(30,rweibull(100,shape=1))),breaks="Scott", xlab="Sample Means", main="Histogram for 30 Replications")
hist(colMeans(replicate(300,rweibull(100,shape=1))),breaks="Scott", xlab="Sample Means", main="Histogram for 300 Replications")
hist(colMeans(replicate(3000,rweibull(100,shape=1))),breaks="Scott", xlab="Sample Means", main="Histogram for 3,000 Replications")
hist(colMeans(replicate(30000,rweibull(100,shape=1))),breaks="Scott", xlab="Sample Means", main="Histogram for 30,000 Replications")
hist(colMeans(replicate(300000,rweibull(100,shape=1))),breaks="Scott", xlab="Sample Means", main="Histogram for 300,000 Replications")
hist(colMeans(replicate(3000000,rweibull(100,shape=1))),breaks="Scott", xlab="Sample Means", main="Histogram for 3,000,000 Replications")
</pre></div>
<p>The result is a very wonderful .gif:</p>
<p><img class="img-responsive img-center" src="/images/posts/2015-11-CLT.gif" width="80%"/></p>
<p>As the replication size increases, the histogram begins to resemble a normal distribution. Neat!</p>
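<p>The same experiment translates directly to other languages; here's a sketch in Python/NumPy (rather than the R above), with a check against the normal parameters the CLT predicts:</p>

```python
import numpy as np

rng = np.random.default_rng(0)

# 30,000 replications of a size-100 sample from a Weibull with shape 1
# (equivalently an Exponential(1): population mean 1, population sd 1).
means = rng.weibull(1.0, size=(30_000, 100)).mean(axis=1)

# The CLT predicts the sample means are approximately Normal(1, 1/sqrt(100)).
print(means.mean(), means.std())  # both should land near 1 and 0.1
```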
<h3>UPDATE:</h3>
<p>A friend writes:</p>
<blockquote>
<p><em>You should emphasize more the process that you are taking the MEAN of sub samples of your 'population' which is Weibull distributed. Then, by creating a vector of these means, one is able to show that these "means" converge to a normal distribution as N approaches infinity. As it is, to the 'less' experienced reader perhaps will fail to realize that you take the means of sub samples of that pop'n, and then it is the means which become normally distributed.</em></p>
</blockquote>
<p>Good point; thanks, Keith!</p>

<h1>Donald Trump's Campaign Contributions</h1>
<p><em>2015-11-04 · Mischa Fisher</em></p>
<p>Donald Trump recently passed the 100-day mark as the Republican front-runner for the presidential nomination, which means I was wrong (as many friends have reminded me) about him being a short-term fad.</p>
<p>To scratch a curious itch, I decided to pull The Donald's campaign contributions from the <a href="http://www.fec.gov">FEC's personal disclosure database</a>; that is, to see where The Donald has donated his money over the years. Since I was wrong about how long he'd remain at the front of the Republican primary field, I thought I'd see whether I was also wrong in my suspicion that his donations were skewed toward whichever party was more popular at any given moment over the last 20 years. To the data!</p>
<h2>Sum of all Donations, by Party</h2>
<p><img class="img-responsive img-center" src="/images/posts/2015-11-Trump_01.png" width="80%"/></p>
<p>This is the first breakdown of total contributions over the last 20 or so years, split among donations to Republicans, Democrats, and himself. Unfortunately, it has no time context, and I suspected his donation volume had increased substantially in the most recent year.</p>
<h2>Sum of All Donations by Party, per year</h2>
<p><img class="img-responsive img-center" src="/images/posts/2015-11-Trump_02.png" width="80%"/></p>
<p>This breaks down the total value of all his donations by party, per year. While it shows the recent uptick in spending, it suffers from a form of political inflation: donation values, for normative and legal reasons, have increased substantially across the board in recent cycles. To correct for that, I also looked at the total number of donations made...</p>
<h2>Total Number of Donations by Party, per year</h2>
<p><img class="img-responsive img-center" src="/images/posts/2015-11-Trump_03.png" width="80%"/></p>
<p>This better captures how far and wide The Donald has spread his political largesse over the past 20 years by looking at the total number of donations made to each party (and to himself). </p>
<h2>Total Number of Donations by Party, per Election Cycle</h2>
<p><img class="img-responsive img-center" src="/images/posts/2015-11-Trump_04.png" width="80%"/></p>
<p>Finally, to remove the cyclical nature of donations, I grouped the years into election cycles, since off-year donations tend to be lower than on-year donations.</p>
<p>This to me is the clearest story about his donations. There is clearly a shift in the nature of his partisan giving, and that shift comes pretty close to when President Obama's popularity started to decline. The spike in donations to Democrats came at much the same time as the big drop in President Bush's popularity.</p>
<p>One could interpret these results in two ways: first, Donald Trump is an opportunist who simply changed his giving to reflect a change in where he viewed an opportunity to run (specifically, look at the big change between the 2006 election cycle and the current one). Alternatively, one could think he's a dedicated partisan who stopped his political giving because he grew sick of holding the middle ground. I think that second scenario is unlikely, but then again, I've been wrong before.</p>

<h1>An Intuitive Explanation of the OLS Estimator for both Traditional and Matrix Algebra</h1>
<p><em>2015-10-19 · Mischa Fisher</em></p>
<p>The Ordinary Least Squares estimator, \( \hat{\beta} \), is the first thing one learns in econometrics. It has two forms, one in standard algebra and one in matrix algebra, but it's important to remember that the two are equivalent:</p>
<h2>$$ \hat{\beta} = \frac{\hat{cov}(x,y)}{var(x)} = \mathbf{({X}'X)^{-1}{X}'Y} $$</h2>
<p>I think most students will find it extremely easy to get lost in the notation and miss the link to real-world data. The following exercise is a helpful way to keep the link between traditional 'simple' notation, matrix algebra notation, and the underlying data and arithmetic that go into the ordinary linear regression estimator.</p>
<h2>Deriving the Algebraic Notation for the Simple Bivariate Model</h2>
<p>The familiar simple bivariate model expresses an individual observation as a function of an intercept, a regression coefficient, and an error term (respectively):</p>
<p>$$ y_{i} = b_{0} + b_{1}x_{i} + e_{i} $$</p>
<p>Where we wish to minimize the sum of squared errors (SSE):</p>
<p>$$ minimize: SSE = \sum_{i=1}^{N} e_{i}^{2} $$</p>
<p>To do so we isolate the error of the regression to make it a function of the other terms:</p>
<p>$$ e_{i} = y_{i} - b_{0} - b_{1}x_{i} $$</p>
<p>Then substitute: </p>
<p>$$ minimize: \sum_{i=1}^{N} (y_{i} - b_{0} - b_{1}x_{i})^{2} $$</p>
<p>For our purposes, we'll ignore the derivation of the intercept and take as given that it is \( \bar{y} - \hat{\beta_{1}}\bar{x} \), solving only for the \( \hat{\beta_{1}} \) slope coefficient. To minimize the errors, we take the partial derivative with respect to \( b_{1} \):</p>
<p>$$ \frac{\partial SSE }{\partial b_{1}} = \frac{\partial }{\partial b_{1}} \left [ \sum_{i=1}^{N} (y_{i} - b_{0} - b_{1}x_{i})^{2} \right ] $$</p>
<p>Move the summation operator through, since the derivative of a sum is equal to the sum of the derivatives:</p>
<p>$$ \frac{\partial SSE }{\partial b_{1}} = \sum_{i=1}^{N} \left [ \frac{\partial }{\partial b_{1}} (y_{i} - b_{0} - b_{1}x_{i})^{2} \right ] $$</p>
<p>Take the derivative (using the chain rule), then set it equal to 0 as the first-order condition to find the min/max:</p>
<p>$$ \frac{\partial SSE }{\partial b_{1}} = -2 \sum_{i=1}^{N} x_{i}(y_{i} - b_{0} - b_{1}x_{i}) = 0 $$</p>
<p>Then multiply by \( - \frac{1}{2} \) to simplify:</p>
<p>$$ 0 = \sum_{i=1}^{N} x_{i}(y_{i} - b_{0} - b_{1}x_{i}) $$</p>
<p>Substitute the solution for the intercept, \( b_{0} \), that we took as a given above:</p>
<p>$$ 0 = \sum_{i=1}^{N} x_{i}(y_{i} - (\bar{y} - \hat{\beta_{1}}\bar{x} ) - b_{1}x_{i}) $$</p>
<p>Then rearrange and distribute the summation operator to solve for \( \hat{\beta_{1}} \) :</p>
<p>$$ \hat{\beta_{1}} = \frac{\sum_{i=1}^{N} (y_{i} - \bar{y} )x_{i}}{ \sum_{i=1}^{N} (x_{i} - \bar{x})x_{i} } $$</p>
<p>Which is algebraically equivalent to:</p>
<p>$$ \frac{\hat{cov}(x,y)}{var(x)} = \frac{ \frac{1}{n} \sum_{i=1}^{n} (x_{i} - \bar{x})(y_{i} - \bar{y} )}{ \frac{1}{n} \sum_{i=1}^{n} (x_{i} - \bar{x})^{2} } = \hat{\beta_{1}}$$</p>
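<p>The equivalence between the covariance form and a least-squares fit can be verified numerically; here's a sketch in Python/NumPy on simulated data (the variable names and the true slope of 3 are arbitrary):</p>

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.normal(size=200)
y = 2.0 + 3.0 * x + rng.normal(size=200)

# beta_hat = cov(x, y) / var(x), using the same (biased) normalization in both
beta_cov = np.cov(x, y, bias=True)[0, 1] / np.var(x)
beta_fit = np.polyfit(x, y, 1)[0]   # slope from a least-squares fit
assert np.isclose(beta_cov, beta_fit)
```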
<h2>Deriving the Matrix Algebra Notation</h2>
<p>Despite typically not being taught until the senior undergraduate or graduate level, the derivation in matrix notation is actually a little more straightforward (as long as one remembers the rules of matrix algebra, which I typically do not).</p>
<p>First, visualize the linear model again, but this time in matrix notation where <strong>Y</strong> and <strong>e</strong> are vectors of the observations and <strong>X</strong> is the matrix of the independent variables and their observations:</p>
<p>$$ \mathbf{Y = XB + e} $$</p>
<p>Just as before, we want to minimize the sum of squared errors</p>
<p>$$ minimize: SSE = \mathbf{e'e} $$</p>
<p>Rearranging and substituting yields: </p>
<p>$$ SSE = \mathbf{(Y - XB)'(Y - XB)} $$</p>
<p>Push the transpose operator through:</p>
<p>$$ SSE = \mathbf{(Y' - B'X')(Y - XB)} $$</p>
<p>Multiply the whole equation out:</p>
<p>$$ SSE = \mathbf{Y'Y - Y'XB - B'X'Y + B'X'XB} $$</p>
<p>Combine the two middle terms (each is a scalar and the transpose of the other, so they are equal):</p>
<p>$$ SSE = \mathbf{Y'Y - 2Y'XB + B'X'XB} $$</p>
<p>Then again as before we'll take the partial derivative for the first order condition:</p>
<p>$$ \frac{\partial SSE }{\mathbf{\partial B}} = \frac{\partial }{\mathbf{\partial B}} (\mathbf{Y'Y - 2Y'XB + B'X'XB)}$$</p>
<p>And set to 0 to find the minimum:</p>
<p>$$ \frac{\partial SSE }{\mathbf{\partial B}} = \mathbf{-2X'Y + 2X'XB} = 0 $$</p>
<p>Then isolate \( \mathbf{B} \) and simplify:</p>
<p>$$ \mathbf{B} = \mathbf{(X'X)^{-1}X'Y} $$</p>
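<p>This closed form can be evaluated directly and compared against a library solver; a sketch in Python/NumPy (simulated data, arbitrary true coefficients):</p>

```python
import numpy as np

rng = np.random.default_rng(2)
n = 100
X = np.column_stack([np.ones(n), rng.normal(size=n)])  # intercept column + one regressor
Y = X @ np.array([2.0, 3.0]) + rng.normal(size=n)

B_formula = np.linalg.inv(X.T @ X) @ X.T @ Y           # B = (X'X)^{-1} X'Y
B_lstsq, *_ = np.linalg.lstsq(X, Y, rcond=None)        # library least-squares solver
assert np.allclose(B_formula, B_lstsq)
```

<p>In practice one solves the normal equations (or uses a QR-based routine such as <code>lstsq</code>) rather than forming the inverse explicitly, which is numerically safer.</p>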
<h2>Getting Real World Data</h2>
<p>Pulling data from <a href="http://www.autotrader.com">autotrader.com</a> for Honda CR-Vs, I came up with 9 observations across three simple variables: price, mileage, and year.</p>
<table>
<thead>
<tr>
<th><div class="text-center"> PRICE </div></th>
<th><div class="text-center">YEAR</div></th>
<th><div class="text-center">MILEAGE</div></th>
</tr>
</thead>
<tbody>
<tr>
<td>$19998</td>
<td>2012</td>
<td>16568</td>
</tr>
<tr>
<td>$16995</td>
<td>2011</td>
<td>68399</td>
</tr>
<tr>
<td>$22491</td>
<td>2013</td>
<td>23813</td>
</tr>
<tr>
<td>$27571</td>
<td>2014</td>
<td>15156</td>
</tr>
<tr>
<td>$25998</td>
<td>2014</td>
<td>17201</td>
</tr>
<tr>
<td>$24000</td>
<td>2012</td>
<td>28946</td>
</tr>
<tr>
<td>$15495</td>
<td>2010</td>
<td>87440</td>
</tr>
<tr>
<td>$13290</td>
<td>2007</td>
<td>83060</td>
</tr>
<tr>
<td>$8449</td>
<td>2006</td>
<td>153549</td>
</tr>
</tbody>
</table>
<p>Using car price as a function of car year in a simple bivariate model, one can find the OLS slope coefficient in R using the simple call:</p>
<div class="highlight"><pre><span class="kp">options</span><span class="p">(</span><span class="s">"scipen"</span><span class="o">=</span><span class="m">100</span><span class="p">,</span> <span class="s">"digits"</span><span class="o">=</span><span class="m">4</span><span class="p">)</span>
price <span class="o"><-</span> <span class="kt">c</span><span class="p">(</span><span class="m">19998</span><span class="p">,</span><span class="m">16995</span><span class="p">,</span><span class="m">22491</span><span class="p">,</span><span class="m">27571</span><span class="p">,</span><span class="m">25998</span><span class="p">,</span><span class="m">24000</span><span class="p">,</span><span class="m">15495</span><span class="p">,</span><span class="m">13290</span><span class="p">,</span><span class="m">8449</span><span class="p">)</span>
mileage <span class="o"><-</span> <span class="kt">c</span><span class="p">(</span><span class="m">16568</span><span class="p">,</span><span class="m">68399</span><span class="p">,</span><span class="m">23813</span><span class="p">,</span><span class="m">15156</span><span class="p">,</span><span class="m">17201</span><span class="p">,</span><span class="m">28946</span><span class="p">,</span><span class="m">87440</span><span class="p">,</span><span class="m">83060</span><span class="p">,</span><span class="m">153549</span><span class="p">)</span>
year <span class="o"><-</span> <span class="kt">c</span><span class="p">(</span><span class="m">2012</span><span class="p">,</span><span class="m">2011</span><span class="p">,</span><span class="m">2013</span><span class="p">,</span><span class="m">2014</span><span class="p">,</span><span class="m">2014</span><span class="p">,</span><span class="m">2012</span><span class="p">,</span><span class="m">2010</span><span class="p">,</span><span class="m">2007</span><span class="p">,</span><span class="m">2006</span><span class="p">)</span>
crv <span class="o"><-</span> <span class="kt">data.frame</span><span class="p">(</span>mileage<span class="p">,</span>price<span class="p">,</span>year<span class="p">)</span>
lm<span class="p">(</span>crv<span class="o">$</span>price <span class="o">~</span> crv<span class="o">$</span>year<span class="p">)</span>
</pre></div>
<p>Which yields the basic result that the asking price of the car goes down by $2103 for each year it gets older. </p>
<h2>The Bivariate Example Using Simple Algebra and Arithmetic</h2>
<p>This is the important part. It's tedious, but straightforward, and writing it out all by hand will really remind you that computers are remarkable tools.</p>
<p>The derivation results from Part I tell us:</p>
<p>$$ \frac{\hat{cov}(x,y)}{var(x)} = \frac{ \frac{1}{n} \sum_{i=1}^{n} (x_{i} - \bar{x})(y_{i} - \bar{y} )}{ \frac{1}{n} \sum_{i=1}^{n} (x_{i} - \bar{x})^{2} } = \hat{\beta_{1}}$$</p>
<p>In this case \( \bar{x} \) is the sample mean for year, which is 2011, and \( \bar{y} \) is the sample mean for price, roughly $19365. (Rounding \( \bar{y} \) is harmless here: its contribution to the numerator vanishes because the deviations \( x_{i} - \bar{x} \) sum to zero.)</p>
<p>Expanding the estimator out to its completely tangible form yields the ridiculously cumbersome equation:</p>
<h6>$$ \frac{((2012-2011)(19998-19365))+ ((2011-2011)(16995-19365))+ ((2013-2011)(22491-19365))+ ((2014-2011)(27571-19365))+ ((2014-2011)(25998-19365))+ ((2012-2011)(24000-19365))+ ((2010-2011)(15495-19365))+ ((2007-2011)(13290-19365))+ ((2006-2011)(8449-19365))}{((2012-2011)^{2})+((2011-2011)^{2})+((2013-2011)^{2})+((2014-2011)^{2})+((2014-2011)^{2})+((2012-2011)^{2})+((2010-2011)^{2})+((2007-2011)^{2})+((2006-2011)^{2})}$$</h6>
<p>Which simplifies to </p>
<p>$$ \frac{138787}{66} \approx 2102.83 $$ </p>
<p>Almost exactly what R told us the coefficient would be!</p>
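<p>If you'd rather not trust the by-hand arithmetic, the same sums are easy to check in a few lines of Python (a convenience sketch I've added, not part of the original post):</p>

```python
# Re-checking the hand calculation: the deviation products should sum
# to 138787 and the squared deviations to 66.
year = [2012, 2011, 2013, 2014, 2014, 2012, 2010, 2007, 2006]
price = [19998, 16995, 22491, 27571, 25998, 24000, 15495, 13290, 8449]

x_bar = sum(year) / len(year)    # 2011.0, the sample mean for year
y_bar = sum(price) / len(price)  # about 19365, the sample mean for price

numerator = sum((x - x_bar) * (y - y_bar) for x, y in zip(year, price))
denominator = sum((x - x_bar) ** 2 for x in year)
print(round(numerator), round(denominator), round(numerator / denominator, 2))
# 138787 66 2102.83
```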
<h2>The Bivariate Example Using Matrix Algebra</h2>
<p>Things get a little trickier here, but the process is the same; the bivariate estimator is:</p>
<p>$$ \mathbf{\hat{B}} = \mathbf{(X'X)^{-1}X'Y} $$</p>
<p>Substituting our real world data for the general matrices leaves the form:</p>
<p>$$ \left({{\begin{bmatrix}
1 & 2012 \\
1 & 2011 \\
1 & 2013 \\
1 & 2014 \\
1 & 2014 \\
1 & 2012 \\
1 & 2010 \\
1 & 2007 \\
1 & 2006
\end{bmatrix}}}'{\begin{bmatrix}
1 & 2012 \\
1 & 2011 \\
1 & 2013 \\
1 & 2014 \\
1 & 2014 \\
1 & 2012 \\
1 & 2010 \\
1 & 2007 \\
1 & 2006
\end{bmatrix}}\right)^{-1} {{\begin{bmatrix}
1 & 2012 \\
1 & 2011 \\
1 & 2013 \\
1 & 2014 \\
1 & 2014 \\
1 & 2012 \\
1 & 2010 \\
1 & 2007 \\
1 & 2006
\end{bmatrix}}}' \begin{bmatrix}
19998\\
16995\\
22491\\
27571\\
25998\\
24000\\
15495\\
13290\\
8449
\end{bmatrix} = \mathbf{\hat{B}}$$</p>
<p>Which simplifies from four matrices down to two:</p>
<p>$$ \begin{bmatrix}
61274.6717171717 & -30.4696969697 \\
-30.4696969697 & 0.0151515151515
\end{bmatrix}
\begin{bmatrix}
174287 \\
350629944
\end{bmatrix} $$</p>
<p>Which multiplies through to:</p>
<p>$$ \begin{bmatrix}
-4209432.61\\
2102.83
\end{bmatrix} $$</p>
<p>The first row is the intercept in our regression model; the second is the slope coefficient we already calculated both in R and by hand using the expanded summation operator!</p>
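<p>The matrix arithmetic above is easy to fumble by hand, so here is a short exact check using Python's <code>fractions</code> module (a verification sketch I've added, not part of the original post) that carries out \( \mathbf{(X'X)^{-1}X'Y} \) with no rounding at all:</p>

```python
from fractions import Fraction

# Exact (X'X)^{-1} X'Y for the CR-V data; X'X is 2x2, so its inverse is
# (1/det) * [[sum(x^2), -sum(x)], [-sum(x), n]].
year = [2012, 2011, 2013, 2014, 2014, 2012, 2010, 2007, 2006]
price = [19998, 16995, 22491, 27571, 25998, 24000, 15495, 13290, 8449]
n = len(year)

sx, sy = sum(year), sum(price)                 # 18099 and 174287
sxx = sum(x * x for x in year)                 # 36397155
sxy = sum(x * y for x, y in zip(year, price))  # 350629944
det = Fraction(n * sxx - sx * sx)              # 594

intercept = (sxx * sy - sx * sxy) / det
slope = (n * sxy - sx * sy) / det

print(round(float(intercept), 2), round(float(slope), 2))
# -4209432.61 2102.83
```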
<h2>The Takeaway</h2>
<p>Derivations, proofs, and advanced topics in econometrics can be really tricky to wrap your head around at first. By going through the motions of making real world data stand in for the notation, it can a) help you really understand the proofs and derivations, and b) remind you just how many trillions of human hours computers save in answering complicated technical questions!</p>Customizing Pelican for Static and Dynamic Content2015-09-13T13:55:00-07:00Mischa Fishertag:mischafisher.com,2015-09-13:customizing-pelican-for-static-and-dynamic-content.html<p>As is typical with these sorts of things, the online community has been very helpful in sorting out exactly how I wanted to customize this site when I recently rebuilt it using the Python static site generator <em>Pelican</em>. </p>
<p>A brief summary of a few changes I think were extremely helpful in tweaking the stock <em>Pelican</em> build:</p>
<h2>Using Math</h2>
<p>MathJax seemed the simplest solution: embed a short snippet of code linking to their CDN, then write ordinary LaTeX notation within Markdown. It takes only a minute or two:</p>
<div class="highlight"><pre><span class="nt"><script</span> <span class="na">src=</span><span class="s">'https://cdn.mathjax.org/mathjax/latest/MathJax.js?config=TeX-AMS-MML_HTMLorMML'</span><span class="nt">></script></span>
</pre></div>
<h2>Using Bootstrap for Responsive CSS</h2>
<p>Erik Flowers' Bootstrap grid introduction is a very nice resource <a href="http://www.helloerik.com/bootstrap-3-grid-introduction">(Click Here)</a></p>
<h2>Customizing Your Own Theme</h2>
<p>Web developer Robert Iwancz made a great bare bones framework on which one can build out almost anything. <a href="http://www.voidynullness.net/blog/2014/03/30/introducing-voidy-bootstrap-pelican-theme/">(Click Here for his VoidyNullness Theme)</a></p>
<h2>Using Pelican for a Static Landing Page</h2>
<p>Find the .md Markdown file you want to use as your home page content and add the following metadata to the top of the file (the second line prevents your home page from appearing in the menu twice, depending on your theme):</p>
<div class="highlight"><pre><span class="n">save_as</span><span class="o">:</span> <span class="n">index</span><span class="o">.</span><span class="na">html</span>
<span class="n">status</span><span class="o">:</span> <span class="n">hidden</span>
</pre></div>
<p>Then in a separate .md file that will become your blog content, add the metadata identifier at the top to assign the appropriate HTML template from your templates folder:</p>
<div class="highlight"><pre>Title: (Your Blog Title)
Date: (The Date)
Category: Page
Template: (the title of your blog template, without the .html extension)
</pre></div>
<p>Finally, copy the content generating loops from the default page templates into your new blog html template:</p>
<div class="highlight"><pre><span class="cp">{%</span> <span class="k">block</span> <span class="nv">content_body</span> <span class="cp">%}</span>
<span class="cp">{%</span> <span class="k">block</span> <span class="nv">article</span> <span class="cp">%}</span>
<span class="cp">{%</span> <span class="k">if</span> <span class="nv">articles</span> <span class="cp">%}</span>
<span class="cp">{%</span> <span class="k">for</span> <span class="nv">article</span> <span class="k">in</span> <span class="o">(</span><span class="nv">articles_page.object_list</span> <span class="k">if</span> <span class="nv">articles_page</span> <span class="k">else</span> <span class="nv">articles</span><span class="o">)</span> <span class="cp">%}</span>
<span class="nt"><article></span>
<span class="cp">{%</span> <span class="k">for</span> <span class="nv">file</span> <span class="k">in</span> <span class="nv">CUSTOM_INDEX_ARTICLE_HEADERS</span> <span class="cp">%}</span>
<span class="cp">{%</span> <span class="k">include</span> <span class="s2">"includes/"</span> <span class="o">+</span> <span class="nv">file</span> <span class="cp">%}</span>
<span class="cp">{%</span> <span class="k">else</span> <span class="cp">%}</span>
<span class="cp">{%</span> <span class="k">include</span> <span class="s2">"includes/article_header.html"</span> <span class="cp">%}</span>
<span class="cp">{%</span> <span class="k">endfor</span> <span class="cp">%}</span>
<span class="cp">{%</span> <span class="k">if</span> <span class="nv">ARTICLE_FULL_FIRST</span> <span class="k">is</span> <span class="nf">defined</span> <span class="k">and</span> <span class="nb">loop</span><span class="nv">.first</span> <span class="k">and</span> <span class="k">not</span> <span class="nv">articles_page.has_previous</span><span class="o">()</span> <span class="cp">%}</span>
<span class="nt"><div</span> <span class="na">class=</span><span class="s">"content-body"</span><span class="nt">></span>
<span class="cp">{%</span> <span class="k">if</span> <span class="nv">article.standfirst</span> <span class="cp">%}</span>
<span class="nt"><p</span> <span class="na">class=</span><span class="s">"standfirst"</span><span class="nt">></span><span class="cp">{{</span> <span class="nv">article.standfirst</span><span class="o">|</span><span class="nf">e</span> <span class="cp">}}</span><span class="nt"></p></span>
<span class="cp">{%</span> <span class="k">endif</span> <span class="cp">%}</span>
<span class="cp">{{</span> <span class="nv">article.content</span> <span class="cp">}}</span>
<span class="cp">{%</span> <span class="k">include</span> <span class="s2">"includes/comments.html"</span> <span class="cp">%}</span>
<span class="nt"></div></span>
<span class="cp">{%</span> <span class="k">else</span> <span class="cp">%}</span>
<span class="cp">{%</span> <span class="k">include</span> <span class="s2">"includes/index_summary.html"</span> <span class="cp">%}</span>
<span class="cp">{%</span> <span class="k">endif</span> <span class="cp">%}</span>
<span class="nt"></article></span>
<span class="nt"><hr</span> <span class="nt">/></span>
<span class="cp">{%</span> <span class="k">endfor</span> <span class="cp">%}</span>
<span class="cp">{%</span> <span class="k">endif</span> <span class="cp">%}</span>
<span class="cp">{%</span> <span class="k">endblock</span> <span class="nv">article</span> <span class="cp">%}</span>
<span class="cp">{%</span> <span class="k">block</span> <span class="nv">pagination</span> <span class="cp">%}</span>
<span class="nt"><nav</span> <span class="na">class=</span><span class="s">"index-pager"</span><span class="nt">></span>
<span class="cp">{%</span> <span class="k">if</span> <span class="nv">articles_page</span> <span class="k">and</span> <span class="nv">articles_paginator.num_pages</span> <span class="o">></span> <span class="m">1</span> <span class="cp">%}</span>
<span class="nt"><ul</span> <span class="na">class=</span><span class="s">"pagination"</span><span class="nt">></span>
<span class="cp">{%</span> <span class="k">if</span> <span class="nv">articles_page.has_previous</span><span class="o">()</span> <span class="cp">%}</span>
<span class="nt"><li</span> <span class="na">class=</span><span class="s">"prev"</span><span class="nt">></span>
<span class="nt"><a</span> <span class="na">href=</span><span class="s">"</span><span class="cp">{{</span> <span class="nv">SITEURL</span> <span class="cp">}}</span><span class="s">/</span><span class="cp">{{</span> <span class="nv">articles_previous_page.url</span> <span class="cp">}}</span><span class="s">"</span><span class="nt">></span>
<span class="nt"><i</span> <span class="na">class=</span><span class="s">"fa fa-chevron-circle-left fa-fw fa-lg"</span><span class="nt">></i></span> Previous
<span class="nt"></a></span>
<span class="nt"></li></span>
<span class="cp">{%</span> <span class="k">else</span> <span class="cp">%}</span>
<span class="nt"><li</span> <span class="na">class=</span><span class="s">"prev disabled"</span><span class="nt">><span></span>
<span class="nt"><i</span> <span class="na">class=</span><span class="s">"fa fa-chevron-circle-left fa-fw fa-lg"</span><span class="nt">></i></span>
Previous<span class="nt"></span></span>
<span class="nt"></li></span>
<span class="cp">{%</span> <span class="k">endif</span> <span class="cp">%}</span>
<span class="cp">{%</span> <span class="k">for</span> <span class="nv">num</span> <span class="k">in</span> <span class="nv">articles_paginator.page_range</span> <span class="cp">%}</span>
<span class="cp">{%</span> <span class="k">if</span> <span class="nv">num</span> <span class="o">==</span> <span class="nv">articles_page.number</span> <span class="cp">%}</span>
<span class="nt"><li</span> <span class="na">class=</span><span class="s">"active"</span><span class="nt">></span> <span class="nt"><span></span><span class="cp">{{</span> <span class="nv">num</span> <span class="cp">}}</span><span class="nt"></span></span> <span class="nt"></li></span>
<span class="cp">{%</span> <span class="k">else</span> <span class="cp">%}</span>
<span class="nt"><li></span>
<span class="nt"><a</span> <span class="na">href=</span><span class="s">"</span><span class="cp">{{</span> <span class="nv">SITEURL</span> <span class="cp">}}</span><span class="s">/</span><span class="cp">{{</span> <span class="nv">articles_paginator.page</span><span class="o">(</span><span class="nv">num</span><span class="o">)</span><span class="nv">.url</span> <span class="cp">}}</span><span class="s">"</span><span class="nt">></span><span class="cp">{{</span> <span class="nv">num</span> <span class="cp">}}</span><span class="nt"></a></span>
<span class="nt"></li></span>
<span class="cp">{%</span> <span class="k">endif</span> <span class="cp">%}</span>
<span class="cp">{%</span> <span class="k">endfor</span> <span class="cp">%}</span>
<span class="cp">{%</span> <span class="k">if</span> <span class="nv">articles_page.has_next</span><span class="o">()</span> <span class="cp">%}</span>
<span class="nt"><li</span> <span class="na">class=</span><span class="s">"next"</span><span class="nt">></span>
<span class="nt"><a</span> <span class="na">href=</span><span class="s">"</span><span class="cp">{{</span> <span class="nv">SITEURL</span> <span class="cp">}}</span><span class="s">/</span><span class="cp">{{</span> <span class="nv">articles_next_page.url</span> <span class="cp">}}</span><span class="s">"</span><span class="nt">></span>
Next <span class="nt"><i</span> <span class="na">class=</span><span class="s">"fa fa-chevron-circle-right fa-fw fa-lg"</span><span class="nt">></i></span>
<span class="nt"></a></span>
<span class="nt"></li></span>
<span class="cp">{%</span> <span class="k">else</span> <span class="cp">%}</span>
<span class="nt"><li</span> <span class="na">class=</span><span class="s">"next disabled"</span><span class="nt">></span>
<span class="nt"><span><i</span> <span class="na">class=</span><span class="s">"fa fa-chevron-circle-right fa-fw fa-lg"</span><span class="nt">></i></span> Next<span class="nt"></span></span>
<span class="nt"></li></span>
<span class="cp">{%</span> <span class="k">endif</span> <span class="cp">%}</span>
<span class="nt"></ul></span>
<span class="cp">{%</span> <span class="k">endif</span> <span class="cp">%}</span>
</pre></div>
<h2>Hiding Pages from the Menu</h2>
<p>Quite simple, just change the metadata on any given page's .md file to:</p>
<div class="highlight"><pre><span class="n">Status</span><span class="o">:</span> <span class="n">Hidden</span>
</pre></div>
<h2>Embedding Data from a CSV</h2>
<p>To generate my reading list, I use a piece of JavaScript that references the base .csv containing all the books. The script is fantastic because it keeps all the flexibility of storing the data in a .csv, without needing a server-side scripting language (a luxury a static site on Amazon S3 doesn't have).</p>
<div class="highlight"><pre><span class="nt"><div</span> <span class="na">class=</span><span class="s">"d3-chart"</span><span class="nt">></div></span>
<span class="nt"><script</span> <span class="na">src=</span><span class="s">"http://d3js.org/d3.v3.min.js"</span><span class="nt">></script></span>
<span class="nt"><script</span> <span class="na">src=</span><span class="s">"d3.min.js?v=3.2.8"</span><span class="nt">></script></span>
<span class="nt"><script</span> <span class="na">type=</span><span class="s">"text/javascript"</span><span class="na">charset=</span><span class="s">"utf-8"</span><span class="nt">></span>
d3.text("/images/data.csv", function(data) {
var parsedCSV = d3.csv.parseRows(data);
var container = d3.select(".d3-chart")
.append("table")
.selectAll("tr")
.data(parsedCSV).enter()
.append("tr")
.selectAll("td")
.data(function(d) { return d; }).enter()
.append("td")
.text(function(d) { return d; });
});
<span class="nt"></script></span>
</pre></div>
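<p>One build-time alternative worth noting (a hypothetical sketch, not what this site actually uses): since Pelican is already running Python when it generates the site, the CSV could instead be rendered to a static HTML table during the build, with no client-side script at all. The function name and sample data here are made up for illustration:</p>

```python
import csv
import html
import io

def csv_to_html_table(csv_text):
    """Render CSV text as an HTML table, escaping each cell."""
    rows = csv.reader(io.StringIO(csv_text))
    body = "".join(
        "<tr>" + "".join(f"<td>{html.escape(cell)}</td>" for cell in row) + "</tr>"
        for row in rows
    )
    return "<table>" + body + "</table>"

# Hypothetical two-column reading-list data:
print(csv_to_html_table("Title,Author\nThe Wealth of Nations,Adam Smith"))
```

The returned string can then be dropped into the page template at build time instead of the empty <code>d3-chart</code> div.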
<h2>Using a Contact Form</h2>
<p>FormSpree is ridiculously headache free and works very well <a href="http://www.formspree.io">(Click Here)</a>. </p>
<h2>Adding a Parallax Effect</h2>
<p>Using Javascript is an option since the browser does the work, rather than a server; however, I found a pure CSS approach to be cleaner and easier. <a href="http://keithclark.co.uk/articles/pure-css-parallax-websites/">Keith Clark's demo HERE</a> is superb. </p>The Pros and Cons of Using a Static Site Generator2015-09-12T16:55:00-07:00Mischa Fishertag:mischafisher.com,2015-09-12:the-pros-and-cons-of-using-a-static-site-generator.html<p>In a recent effort to re-gear this site toward the quantitative and away from the strictly artistic, I rebuilt the site from scratch with one singular aim: make posting effortless. In that spirit, I was directed by a <a href="http://sahandsaba.com/" title="Sahand Saba">good friend</a> toward static site generators; a new development in the 8 or so years since I had last looked at any of the technologies surrounding web development. </p>
<p>With the new site up and running, here is a brief list of the pros and cons, as I see them, of using a static site generator:</p>
<h2>Pros:</h2>
<ol>
<li>
<p>Effortless posting: write posts in Markdown, then upload with a few keystrokes straight from the terminal.</p>
</li>
<li>
<p>Cheap and scalable hosting: I'm using Amazon's S3 for hosting, and Route 53 for DNS services. They're almost free at low traffic, and infinitely scalable at high traffic.</p>
</li>
<li>
<p>No backend to maintain: PHP, SQL, and the slow load times and unresponsiveness of shared servers on most hosting plans are a thing of the past. (I'm looking at you, GoDaddy.com.)</p>
</li>
<li>
<p>Easy to back up or migrate: I keep all the website files in a folder on my laptop that is backed up to Dropbox in real time. Version control through something like Git is also handy, particularly when messing around with the underlying Python scripts that generate the site.</p>
</li>
</ol>
<h2>Cons:</h2>
<ol>
<li>
<p>Steep learning curve: the list of technologies one has to look at includes <em>HTML, CSS, Python, JavaScript, Markdown, the terminal, Pelican, FontAwesome, Jinja, Bootstrap, s3cmd, pip, brew,</em> and virtual environments.</p>
</li>
<li>
<p>Longer to get set up: with a Squarespace account you can be up and running in minutes, and it will look a lot prettier by default.</p>
</li>
</ol>
<p>For me those were the biggest pros and cons I weighed in setting this site up. (Your mileage may vary.)</p>
<!--- <img class="img-responsive img-circle img-center" src="/images/z_weddings.jpg" width="350"/> -->