Many Dimensions Of How Net Neutrality Can Be Interpreted And Used


A European Union agency has said that mobile network Three’s plans to offer ad-blockers would violate net neutrality, which I think is the perfect example of how laws trying to protect the virtue of the Internet will ultimately play out. I’m not saying that we shouldn’t try to have these laws in place; it is more a nod to the dark creativity of capitalism, which always finds a way around the containment powers of laws and regulation.

I am pretty sure that the experiment we knew as the web is over. Relevant to this story, I am pretty confident the lust for advertising has been one of the main reasons the web is so fucked up. There is just too much money in the game now. The stakes are high, and people are too greedy to go back to the way things were. Sadly, hunting for loopholes in laws, and figuring out how to twist and bend good laws on behalf of those in power, is how some smart people enjoy spending their time.

The social walled gardens we tend, like Facebook, and the aggressive nature of the public social gardens we loiter in, will continue to drive us toward a world where we ask for net neutrality to be broken. We will demand a clean, sanitized, and ultimately corporate vision of the web because it feels safer and more secure than the alternative. While I am fascinated by the many dimensions of how net neutrality can be interpreted and put to use, I’m saddened that we couldn’t make the web work as its creators intended.




Technology Carnival Barker Prediction Market Will Be Worth $2 Trillion By 2021


One of the themes that really stands out in my monitoring of the API space lately is the quantity of information flowing around out there trying to convince all of us of what the next thing will be. When you are down in the weeds these things tend to flow by and do not mean much, and some even seem like they might become reality. However, after you step away for a bit, and then come back, you really begin to notice how much of it is dedicated to shaping our reality across the digital space.

It feels like I’m walking through a real-time digital carnival, with barkers coming at me every couple of blog posts and tweets. Believe in our DevOps solution! Come see my analytics dashboard here! IoT in the home is what you should be thinking about! Are you investing in VR? Blockchain will do it all! It is interesting to watch which themes have legs, and which ones are one-night stands. Some of these are interesting carnival rides, but most are barely a dime show attraction.

If I cared more, I’d be tracking the claims being made, associating them with a company, or even an individual, and keeping a scorecard of the claims and what kind of track record people actually have (this will go down on your permanent record). Except I have better things to be doing, and have to constantly work to stay focused on what truly matters amongst all this noise. I’ve never been one to make predictions like a real, pants-wearing industry analyst, but I do think that the technology carnival barker prediction market will be worth $2 trillion by 2021.



Activated Two-Factor Authentication On All My Critical Accounts


As I work to maintain my online presence, I am always looking for ways to keep my presence, data, and content protected. My latest crusade is focused on two-factor authentication. While I did have two-factor authentication enabled for Google, I did not have it enabled for Github, AWS, and Apple. I am not sure why I hadn’t, probably just a time thing, but now they are all activated.

I’m thankful that AWS, Github, and Google all use the Google Authenticator app, which centralizes my management of the codes required to validate I am who I say I am. With all the hacks going on, specifically the most recent one at Yahoo, I am stoked to be using 1Password to manage all of my accounts, as well as employing two-factor authentication wherever it is available, especially on the accounts that are most important to me.

If you aren’t familiar with two-factor authentication, it is a secondary way for platforms to validate who you are when your password is being changed, or your account is being accessed. Platforms can validate you with SMS or via the Google Authenticator app, but recently SMS has been deemed insecure, so try to rely on the authenticator solution when possible. If a service you depend on doesn’t offer two-factor, make sure to let them know it is important to you; there is even a handy service that will help you do this.

In the current online environment, we need all the protection we can get. Two-factor is currently one of the most important ways we can defend the online services we depend on. Make sure you activate it on all your critical accounts; I recommend starting with your primary go-to locations like Apple or Google.



My Client Side YAML Editor Running 100% On Github Pages


I am working on a base template which acts as a seed for all of my micro tools that run 100% on Github. I have a number of little micro tools I want to build, and I needed a series of core building blocks that I could fork and reuse across all my tools. I produced this base template, and next up was creating a simple YAML editor that would allow me to edit the YAML files in the _data folder of any Jekyll site hosted on Github Pages.

The objective here is to provide little applications that use the native functionality of Jekyll and Liquid for displaying static, data-driven applications, and when I need to write data back to the YAML files, I use JavaScript and the Github API. The critical piece is authenticating with the Github API in a 100% client-side environment (i.e., Github Pages). I have done this before using my own Github OAuth proxy, but I want these projects to be forkable, and all you need for access is a Github personal token (which any Github account can generate).

YAML In The _data Folder In Jekyll Sites
When you put YAML into the _data folder of any Jekyll-driven site hosted on Github, some interesting things become possible. Jekyll takes the YAML and loads it up as an object you can easily reference using Liquid markup. This makes Jekyll perfect for building little data-driven micro tools, with YAML at the core. If you need JSON, XML, Atom, or other representations, you can easily publish pages that output in these formats.

This YAML is accessible in the _data folder of the Github repository that houses this site; I just want to provide a simple Github Gist for reference in this story. This YAML will be the driver of static, as well as dynamic, content and data used across this prototype micro tool.
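As a stand-in for that gist, here is a hypothetical example of what a small products data store in _data/products.yml could look like (the product names and fields are made up for illustration):

```yaml
# _data/products.yml -- a hypothetical products data store
- name: API Monitoring
  price: 25.00
  description: Keep an eye on your APIs.
- name: API Design Review
  price: 100.00
  description: A review of your API definitions.
```

Once this file is committed, Jekyll exposes it to Liquid as site.data.products, with each entry available as an object.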

Static Listing Of YAML Data Using Liquid
Next up is to display the YAML data available in the _data folder and render it for viewing. Using Liquid, I am able to dynamically generate an HTML listing of all the data in the YAML, acting as the static listing of the products that I have in my data store. Here is the Liquid that dumps all the products in the YAML file:
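A minimal sketch, assuming the data store lives at _data/products.yml with name, price, and description fields:

```liquid
<ul>
  {% for product in site.data.products %}
    <li>
      <strong>{{ product.name }}</strong> ({{ product.price }})
      <p>{{ product.description }}</p>
    </li>
  {% endfor %}
</ul>
```

Jekyll renders this loop at build time, so the listing ships as plain HTML with no server-side code involved.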

Liquid does have a learning curve, but once you get the hang of it, and have some base templates developed, it gets easier. I’ve been able to recreate anything that I would accomplish dynamically with PHP and a MySQL database, using just Liquid and YAML data stores.

Editing YAML Files Client Side With Github API
I have my YAML data store, and a basic static listing of the data store; now I want to edit it. Using Github.js and the Github API, I am able to mount the YAML files in the _data/ folder and list them out on the page with text boxes for editing. This obviously won’t work for very large YAML files, but for sensibly structured data, kept in small bits, it will work just fine.


Once I’m done working with the data, I can save the form, and some JavaScript I wrote traverses the form, updating the YAML file using the Github API. The trick is that this form, reading from the YAML file and writing to it via the API, isn’t allowed unless you pass in a valid personal token from a Github user who has access to the underlying repository.
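Under the hood, Github.js issues calls like this against the Github contents API; as a rough sketch with fetch (the owner, repo, and path arguments are placeholders, and the API wants file content base64-encoded, plus the current blob sha when updating an existing file):

```javascript
// Base64-encode file content, as the Github contents API expects.
function encodeContent(text) {
  return Buffer.from(text, "utf8").toString("base64");
}

// Sketch: PUT the updated YAML back to the repo using a personal token.
// Assumes the file was already fetched once to learn its current `sha`.
async function saveYaml(token, owner, repo, path, yamlText, sha) {
  const res = await fetch(
    `https://api.github.com/repos/${owner}/${repo}/contents/${path}`,
    {
      method: "PUT",
      headers: {
        Authorization: `token ${token}`,
        "Content-Type": "application/json",
      },
      body: JSON.stringify({
        message: `Update ${path}`,
        content: encodeContent(yamlText),
        sha: sha,
      }),
    }
  );
  return res.json();
}
```

Without a valid token in the Authorization header, the API simply rejects the write, which is the whole security model of this setup.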

My 100% Client Side YAML Editing Micro Tool
That is it. This is my base template for any micro tools I build that will allow for reading and writing YAML files stored in the _data folder. It may not seem like much at first glance, but it opens up a wealth of new possibilities for me when it comes to data-driven tooling and collaborative projects that run on Jekyll (not just Github Pages, as Jekyll can be hosted anywhere).

First, I’m offloading application operations to Github. Sure, Github has some issues from time to time, but I’ve been running 100% on them for over two years, and it is more than sufficient for most needs. Github scales and secures my applications, and I don’t have to be concerned with keeping the infrastructure up and running; I just have to keep track of my personal tokens.

Second, these apps are self-contained and forkable. Anyone can fork one, grab their own personal token, and get to work managing the YAML core of their application. This is important to me. I like my tooling like I like my APIs: little, forkable, and disposable, truly open source tooling that anyone can put to use.

This is just a base template prototype. I’ll come up with more sophisticated versions soon. I just wanted to get the base template for running apps 100% on Github together, plus this simple example of reading and writing YAML data from a _data folder, before I moved on. I have a couple of API micro tools I want to develop in the area of API design, and I needed this functionality to make it all come together.

The base micro tool template and this base micro tool YAML template are both on Github.



My Forkable Base For Building Apps That Run 100% On Github


Github provides a very powerful platform for developing applications. When you use the base Github functionality in conjunction with Github Pages and the Github API, some pretty interesting approaches to application deployment emerge.

I learned this approach from Development Seed while working with the White House to open up data across federal government agencies, but it is an approach I have evolved and improved upon while developing what I am going to call Github micro tools.

My Github micro tools run 100% on Github, using Github Pages as the front-end, the Github repo as a backend, and the Github API as the communication between–with Github OAuth as the security broker of who can put the application to work.

I needed to use this approach across several different micro tools, so I thought I’d create a base template that I can use as a forkable base for these tools I’m building, while also sharing the approach with others.

Apps Running 100% On Github

I like my apps like my APIs: small and reusable. Building applications that run entirely on Github makes sense to me because anyone can fork them and put them to use under their own account, relying on Github to do all the heavy lifting, and cutting out the middleman (me). Each micro tool runs as a Github repository, which comes with all the benefits of Github like versioning, social coding, issue management, and much more. You can fork my project on Github, and begin using it within your Github user account or organization.

Github Pages As Application Front-End

One of the interesting features Github provides with each repository is the ability to launch a simple static site using Github Pages. I use these static project sites to run all my API projects, and I have been evolving them to act as a front-end for this approach to providing micro tools. Github Pages provides a simple place to put all my applications, where I can store and manage them in a very static, secure, and stable way (well, the security and stability is offloaded to Github).

Static Jekyll Application Front-End

Jekyll provides a simple, static way to help tame the front-end of the applications I am building. The static content management system provides tools for managing the look and feel of each application and the pages within, and allows me to have a blog if I want. Additionally, Jekyll provides a YAML and JSON core, which, when combined with Liquid and JavaScript, opens up some pretty interesting opportunities for building applications.

Github API As An App Connector

With the base of an application in place, I am using the Github API as the connector for reading and writing data and content to the application’s underlying Github repository, in addition to relying on the native features available in Jekyll and Liquid. The API allows any application to access its underlying data store when a user is properly authenticated using a Github personal OAuth token.

Github OAuth for Authentication

To allow this application interaction to securely occur, I am relying on Github OAuth as the gatekeeper. For this example, I am using a Github personal token retrieved from within any Github account, instead of a proxy or third-party service, because I want this solution to be forkable and self-contained. Your tokens will not give you access to this application while it exists under my Github account, but if you fork it, your tokens will give you access to only your forked version. All you do is pass a token into this page using ?token=[your token here], and the API will allow for writing to the underlying repository.

Cookie.js To Store The OAuth Token

Once the OAuth token is passed into the URL, I use cookies.js to store the token for use across all potential pages of a micro tool. This approach helps prevent the token from being included in any links and passed between pages. Once the cookie expires, the user is required to pass another valid token in through the URL to set the cookie again, opening up API access to the application’s backend. This project is meant to be interactive.
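A minimal sketch of that handoff; parsing the token from the query string is standard browser API, while the storage line uses raw document.cookie rather than the cookies.js wrapper, and the cookie name is just a placeholder:

```javascript
// Pull the personal token out of the query string, e.g. ?token=abc123
function parseToken(search) {
  return new URLSearchParams(search).get("token");
}

// Sketch: stash the token in a short-lived cookie so other pages of the
// micro tool can use it without it leaking into links between pages.
// (cookies.js offers a nicer wrapper; this is the raw equivalent.)
function storeToken(token, maxAgeSeconds = 3600) {
  document.cookie =
    `github_token=${encodeURIComponent(token)}; ` +
    `max-age=${maxAgeSeconds}; path=/; secure`;
}
```

In the page itself, this would be wired up as something like storeToken(parseToken(window.location.search)) on load, with the cookie expiry forcing the user to re-supply a token periodically.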

Github.js To Communicate With API

With a valid OAuth token, I use Github.js as the client-side JavaScript client for interacting with the Github API. While Github.js allows for using almost all available API endpoints, most application functionality will just be about reading and writing YAML and JSON files using repository paths. For most of the application functionality, I will rely on Liquid for reading YAML and JSON data, and Github.js for writing data to the underlying repo using the Github API, as long as a valid Github OAuth token has been passed in and the user has access to the Github repository for this application.
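As a rough sketch of what the read side looks like against the raw Github contents API (Github.js wraps these endpoints; the owner, repo, and path arguments are placeholders), the file body comes back base64-encoded:

```javascript
// The Github contents API returns file bodies base64-encoded.
function decodeContent(b64) {
  return Buffer.from(b64, "base64").toString("utf8");
}

// Sketch: read a YAML file out of the _data folder of a repo.
async function readYaml(token, owner, repo, path) {
  const res = await fetch(
    `https://api.github.com/repos/${owner}/${repo}/contents/${path}`,
    { headers: { Authorization: `token ${token}` } }
  );
  const file = await res.json();
  // Keep the sha around -- the API requires it when writing the file back.
  return { sha: file.sha, yaml: decodeContent(file.content) };
}
```

Pairing this read with a corresponding write call is all a micro tool really needs to treat a repository as its data store.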

Forkable Base For Apps That Run 100% On Github

I hope this provides a base project that demonstrates what is possible when you build applications on top of Github. I am going to fork it and build another prototype that reads and writes to a YAML file in the _data folder for the underlying repo, exploring what is possible when it comes to using Github as a data-driven micro tool platform.

The code that makes this happen is pretty simple, and the Github repository is meant to be pretty self-contained. The technologies at play here are Github Pages, Jekyll, Liquid, the Github API, Github OAuth personal tokens, Github.js, and cookies.js.

You can find the front-end for this app, and the repo behind this project, over at my Github account. Have fun, and feel free to submit any issues if you have questions or comments.



My Response To People Trying To Sell Me Email Lists of Corporate Users


If you are in the business of technology like I am, you probably get random emails from people trying to sell contacts from leading technology companies. They are usually pretty savvy at getting past SPAM filters, and are persistent at trying to sell the contact information of people at leading companies like SalesForce, Microsoft, Amazon, and pretty much any other company out there.

Like most of the SPAM in my inbox, I just flag it and move on, which I have done with these types of emails, but they keep coming, so I crafted a template email to send back to them, like I do with many of the solicitations I get (I have a folder of templates).

Thank you for your unwanted solicitation. I hope you are doing well (I do not care, but this is what you do, right?). I’m in the business of being a human in the technology space, not making a profit off of selling other people’s information, but hell, there is good money in it, right?

Would you be interested in buying the contact information of people who sell other people’s contact information? You see, I track the IP address and other details of every email I receive trying to sell me contacts. I then conduct research on who they are, discovering their name, home address, phone number, and where their children go to school.

If you think this is a good thing to do, and would like to buy these from me, please send me $$$$. Cause I’m greedy bitches. Please go the fuck away and get a life.


Kin Lane

I’m sure many of these people are just poor people doing the bidding of some pretty sleazy people who think this is a good business idea. I can’t help but push back, especially when they get through the filters and take moments of my time away. Even though it is just seconds, it is still my valuable time.

I know that not everyone can find employment that is ethical and worthy of being proud of, but maybe I can scare a handful of folks into looking for employment elsewhere, and moving on. If not, at least I’m having fun, and I feel a little better.



Keeping Things Static With My Public Presence To Reduce Security Friction


I’ve been pretty vocal about running the API Evangelist network of sites on Github Pages, ever since I first started doing it back in January of 2013. Back then I was just playing around with the concept, but in 2016 my entire public presence runs on Github Pages.

There are several reasons I do this, starting with the simplicity of static website solutions like Jekyll, something that quickly evolves when you marry it with the social approach to managing code that is Github. I like managing my sites this way, but the primary reason I migrated to this setup was security. After a couple of online events where I stepped up to defend my girlfriend Audrey Watters (@audreywatters), I woke up to all of my sites being taken down by some friendly hacker.

I admit I don’t have the best security practices. I have the skills to do better, but everything I do is public, so confidentiality is really not a concern. I just don’t want my shit taken down by someone, or have my readers experience an outage. I got backups of things up the wazoo, in three different locations, including a nuclear missile silo in Nebraska. I can restore and rebuild at any point, but I don’t like people taking my sites down just because they disagree with me.

So I moved everything to run on Github a couple of years ago; I’ll outsource my security to them. All of my API industry research projects have a JSON core, driving the data, content, and API definitions for the APIs I create and keep an eye on; often there are code samples, libraries, and other open tooling as well. So I’d say that my “websites” meet the criteria of being worthy projects for hosting on Github Pages. All of my research, except what ends up in a PDF, is meant to be open, forkable, and remixable, so Github just works for me.

With this move to being static, my world became a static push, instead of a dynamic pull, which significantly reduces the attack surface available to hackers (well, except for the part where Github is hosting my sites, and I’m outsourcing security to them). At least it isn’t my responsibility, plus I get the network effect of being on Github. When this is coupled with CloudFlare for my DNS, offloading my DNS security to their experts, I figure I’m coming out ahead when it comes to securing my public presence, and what is most important to me: my research.

I still have my administrative API monitoring system (which is dynamic), something I will be working to further localize on my workstation and a local server; it doesn’t need to be on the Internet all the time. All that is left then is my API stack, a stack of simple web APIs that help me operate the API Evangelist network. I will have to secure those APIs, but this dramatically reduces the publicly available surface area I have to defend, helping ensure my static presence will always remain available, even if my APIs go away.

In the current online environment, I am not one to pull back from using the cloud, after all I have invested in it, but with the volatility that lies ahead, it makes sense to keep my surface area defined, including all domains and 3rd party services, and reduce its size at every turn. When possible, it also makes sense to go static, something that I’m seeing reduce a lot of friction and concern for me when it comes to maintaining my very public online existence.