“Ready Player Two” was fun and took me on a journey I was not expecting. That said, I was not surprised by the direction that Ernest took with it: the classic AI-armageddon storyline, and it wasn’t bad. I have not read many sci-fi books or books about AI; the genre is a newer discovery for me. After reading a bunch of reviews on GoodReads.com, I was disappointed to see that other, more seasoned reviewers were let down by the book. I can say that when I found out there was a neural-network interface, I knew that Sam would be against it and the rest would be for it. I was able to predict that, at least.
I, however, was extremely pleased with the book and the new cultural references it included. Having some of the final scenes take place in the LOTR universe was exciting, and something I was not expecting. It has inspired me to go back and reread all five books: The Hobbit, the three volumes of LOTR, and The Silmarillion. I hope that they take the time to turn this into a movie; some of these visuals would be amazing.
Overall, I thought this book was great, and Ernest Cline did not disappoint me. I could not put it down and read it in virtually two weeks, which is historically really fast for me. I was letting my wife drive places so I could keep my head buried in the book, much like my teenage self reading the Junior Jedi Knights series!
I knew that after learning the basics of state, I needed to learn some way to write React in a TDD environment. The article mentioned above, as well as the documentation provided by Facebook, did not cover this. So, I did the normal developer thing: I Googled it. That brought me to this little gem of a blog post. The content is incredible, it does in fact work, and it taught me how to get Mocha running with React. However, if you look at the dependencies there, it lists Gulp as a dependency.
I know, I know, all you magpies are in love with Gulp and Grunt and whatever G-word you can think up to build your environment. I, on the other hand, have recently been trying to keep it simple. I have been doing everything in my power to rely on npm scripts and stay away from the extra complexity that the G’s bring. So, I went over to Lena’s GitHub profile, found her repository for react-mocha-example, and forked it. And then the obsession grew even deeper.
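For context, the npm-scripts approach I am talking about boils down to driving everything from `package.json` instead of a gulpfile. A minimal sketch (the exact Mocha flags here are an assumption for illustration, not necessarily what this repo uses):

```json
{
  "scripts": {
    "test": "mocha --require babel-core/register"
  }
}
```

With something like that in place, `npm test` is the whole build-and-test pipeline; no task runner required.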
HULK SMASH!!! …and done, Gulp is removed. No, no, I will provide you with more context. The first thing I did was run `npm install`, just to make sure my tests passed before I did anything. We are green, awesome. Now, what happens if I run `npm uninstall --save-dev gulp gulp-load-plugins gulp-mocha gulp-open`? Well, it uninstalls those modules, of course. Then we run `npm test` again, and we are green. Okay, phew, I didn’t break anything yet. Wait, look in `package.json`: what is this `vinyl-source-stream` package? I have no idea, but it has the word “stream” in it, so it must have something to do with Gulp. A quick search on npmjs.com and BAM, I found it! `vinyl-source-stream` is in fact a Gulp-related package. Alright, sweet, let’s blast that guy too: `npm uninstall --save-dev vinyl-source-stream`. Run `npm test` again, and we are green. SWEET! That just removed five unneeded plugins from this repository. HELL TO THE YEAH! Now, let’s delete the `gulpfile.js`, as we will not need it anymore, and then commit.
Alright, `browserify` is not a bad package at all. However, I just didn’t see the need for it in this repository at this time. So, let’s `npm uninstall --save-dev browserify` and then run `npm test`. Yup, we are green. Alright, we are doing pretty well so far, not a big deal. Let’s commit and then look at `package.json`.
Alright, this is where it began to get a little hairy. I didn’t want to have Babel in the package list at all. So, I ran `npm uninstall --save-dev babel babel-core babel-preset-es2015 babel-preset-react` and then `npm test`. Holy mother of Moses, we have errors. All over. Shoot. Okay, let’s undo that really fast: `git co package.json && npm install`. Let’s look at that list we just uninstalled. At this point in my learning, I am willing to bet money that a package with the name “react” in it is most likely needed. So, let’s try the next one in line: `npm uninstall --save-dev babel-preset-es2015`, then `npm test`. Oops, nope, still getting errors. Okay, again, let’s undo that: `git co package.json && npm install`. Alright, next in line, let’s ditch babel-core: `npm uninstall --save-dev babel-core`, then `npm test`. “Dammit, man! I’m a doctor, not a physicist!” Alright, we have to have that one too. Okay, let’s undo that again: `git co package.json && npm install`. Last try here: `npm uninstall --save-dev babel`, then `npm test`. DING DING DING! We have a winner! Looks like we don’t need the `babel` package itself in our list. Well, at least that gets rid of one package. Let’s commit and move on.
At this point, we have removed everything from `package.json` that we can. However, I did notice when we were running `npm install` that we were getting some warnings and deprecation notices. So, let’s go ahead and delete the `"devDependencies"` element out of our JSON object in `package.json`. Then we can run `npm install --save-dev babel-core babel-preset-es2015 babel-preset-react jsdom mocha mocha-jsdom react react-addons-test-utils react-dom`, and of course run a quick `npm test` to make sure we are still green, and we are. Great. Let’s commit here and move on.
One thing I noticed on that last `npm install` command was a deprecation warning about `mocha-jsdom`, telling us to move to `jsdom-global`. So, let’s just go ahead and do that: `npm uninstall --save-dev mocha-jsdom`, then `npm install --save-dev jsdom-global`. Of course, let’s run `npm test` again. And we have another failure. That is okay, pretty easy to fix this one. Let’s `vim test/component-test.js` and change line 4 to `var jsdom = require('jsdom-global');`, then run `npm test`. Sweet, that was an easy fix, and we got rid of one of the deprecation warnings. Let’s commit at this point.
After that I just changed the example code to be a little more generic and use things like “Hello World!” instead of the chipper “Lovely! Here it is - my very first React component!”. BAH HUMBUG!!! Mr. Scrooge no like chipper. You can see the majority of those changes here.
That is it. You now have a basic React and Mocha test framework that you can duplicate and use on every project. Pretty simple, right? I thought so, and it was a lot of fun to HULK SMASH npm packages for an hour or so. Well, I am off to grab some shut-eye before my daughter wakes up; but tomorrow I will be learning more React and possibly starting the job-hunting app I have been thinking about. Until next time, “You stay classy, San Diego!”
P.S. I technically was able to completely remove the Babel dependency from the project; however, because Facebook has officially endorsed Babel, I didn’t include those steps in this post. You can check out the PR and read more about the Babel/JSTransform fight to the death here!
I think that my greatest strength as an engineer is how much I pay attention to detail. I pour my heart into every piece of code that comes from my fingers, or my pair’s fingers. Because my heart is attached to every piece of code, I do everything I can to make my code a piece of art. I am obsessed with the nerdy things like whitespace, semicolons, same-line curly braces, etc. I want the next person who comes along after me to be able to read and update the code that I write with ease.
I think that my greatest weakness as an engineer is that I tend to take on too much responsibility in an iteration, which makes me tend to work fast. This usually results in my code suffering in quality, with things like whitespace being wrong. I know, the world is going to end, right? It is something that was brought to my attention at my last employer, and it is something that I have been, and will keep, working on throughout my career. I can still live up to my nickname of “FedEx” while slowing down and making sure that mistakes are not made.
Since I am by no means a programming expert (yet, I will get there!), I will not try to completely explain categories; instead, I will just make some notes on the things I learned about them during the pairing session. Categories do not add methods to the parent class at compile time like most other classes; instead, they are actually added at runtime. This allows the interpreter to basically layer the classes on top of each other. Let us look at an example, our core class `Bob`:
class Bob {
    String drinkCoffee() {
        String cup = '16oz'
        return cup
    }
}
In our `Bob` class above, we have a method called `drinkCoffee`. In this method we simply create a string called `cup` and set it to `'16oz'`. We then return our string `cup` at the end of the method. This is just a basic class and method; I am sure that your codebase is much more complex, as is the codebase at my work, but for this example we will keep it simple. Let us get our category on!
class Bill {
    use(Bob) {
        String drinkCoffee() {
            String coffee = 'French Roast with Cream and Sugar'
            return cup + coffee
        }
    }
}
If we look here, we use a category via `use(Bob)` and then extend the functionality by declaring the same method, `drinkCoffee()`. As we can see, we don’t redeclare `String cup = '16oz'`, because we don’t want to change it; the category principle basically allows us to inherit the `cup` variable from the parent class. We then set coffee with `String coffee = 'French Roast with Cream and Sugar'`. At the end of the method we just return `cup + coffee`. The category allows us to easily extend the base method, include our own data and functionality, and inherit the variables from the parent method.
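As a side note, from what I have read, Groovy’s built-in category syntax looks a bit different from the simplified sketch above: a category is a class of static methods whose first parameter is the type being extended, activated inside a `use { }` block. A minimal sketch for comparison (the `CoffeeCategory` name and the French Roast method are made up for illustration):

```groovy
// The core class, unchanged.
class Bob {
    String drinkCoffee() {
        return '16oz'
    }
}

// A category: static methods whose first parameter is the receiver.
class CoffeeCategory {
    // Adds drinkFrenchRoast() to Bob, but only inside a use { } block.
    static String drinkFrenchRoast(Bob self) {
        return self.drinkCoffee() + ' of French Roast'
    }
}

use(CoffeeCategory) {
    assert new Bob().drinkFrenchRoast() == '16oz of French Roast'
}
```

The nice part is the same as described above: `Bob` itself is never modified, and the extra behavior only exists where the category is explicitly in use.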
In our particular case, we were working on a feature where we needed to add image URL data to the SKU object. We knew that we already had a core class that generated a custom Color object, so let’s consider that our OOTB class that we can extend to make more robust. We also knew that our data lived on the Color object’s parent object, Product. Since we were working with Oracle Commerce (OC), my partner showed me a really easy way to view the data being delivered to any OC page: simply append the query string `?format=json` onto the URL and it will show you all the data.
Once you have that format, though, at least in Google Chrome, it looks like one giant unreadable string. Have no fear, my fellow engineer; there is an extension for that! I personally use JSONView, which will re-render the JSON into a readable format for you.
One more thing to remember before starting your feature is to set this rule for yourself: “Avoid, at all costs, changing a core class!” I have not done a lot of programming on top of another platform like OC; most of my work has been done on from-scratch projects. However, I have been told by a few coworkers and mentors that you can almost always get away with never changing a core class. For instance, at work the brave souls we call Architects have designed a platform that sits on top of Oracle Commerce and enhances its functionality and accessibility; therefore, we call this platform our “core” now. When we started our newest project, we began extending those core classes to provide the specific functionality that is not included in our core platform. Sometimes this requires being extremely clever; sometimes it does in fact mean refactoring a core class so that we can properly extend it in the new project. For this we have been using a type of class called Categories, also known as partial classes.
I was reading through some of the posts and getting discouraged very quickly. Everything in my dotfiles was mapped differently than what thoughtbot had. I began to think about this and realized that most of thoughtbot’s materials probably assume you are using their laptop setup. What if you are not? What if you are using a laptop from work that your boss set up for you, so that you can have the same setup as the rest of the team to make pairing easier? How do you learn the key bindings for things like vim or tmux? Well, I have come up with a small tip: READ THE CONFIG FILES!
You might be thinking to yourself, I have looked at those config files and they are confusing and hard to understand. Let’s quickly go over some details and things to look for so that you can begin to understand how to pick out the good things in the config files.
You can see my `.tmux.conf` file here.
In my configuration, the command key for tmux is `ctrl+a`. This can get cumbersome to type, unless you remap your `caps lock` key to `ctrl`. Then all you do is press `caps lock + a` followed by a command.
`bind-key` or `bind`

When you see `bind-key` or `bind` in your config file, to my understanding, this is a way to set shortcuts. So, I read `bind-key` as `ctrl+a` (`caps lock + a`).

Some `bind-key`s (or `bind`s) to note:

- `bind-key | split-window -h`
- `bind-key - split-window -v`
- `bind-key x kill-pane`

Each of these is triggered with `ctrl + a <key>`.
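Put together, the relevant part of such a `.tmux.conf` might look like this (a sketch; the prefix lines reflect my `ctrl+a` remap and are an assumption about any given config):

```conf
# remap the prefix from the default ctrl+b to ctrl+a
set -g prefix C-a
unbind C-b

# the pane shortcuts noted above
bind | split-window -h
bind - split-window -v
bind x kill-pane
```

Reading a few lines like these is usually all it takes to figure out someone else’s key bindings.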
To get better ideas of what to look for and to figure out some key commands, thoughtbot has these two great links:
Here is to *muxing your future!
For me it was extremely enlightening to see a company that does a lot of gem work benchmark their own gem. It was great to see that they don’t always use FactoryGirl in their test suites.
It was awesome to see a company so highly thought of say that it is okay to not persist data in your specs. I have been using FactoryGirl for a few years now, and admittedly probably too much. I thought that it was always necessary to persist the data, but seeing that it’s 100% faster to not persist data is going to make me think twice next time I create a test.
Go test things, and don’t always persist data!
In the podcast, Chris (at least that is who sent me the email about it) describes how to build out a routes file for a book store. He eventually gets to this point in the code:
Rails.application.routes.draw do
  resources :books do
    member do
      patch :publish
      patch :unpublish
    end

    collection do
      patch :publish_all
      post :import
    end
  end
end
He then describes a very common case in the startup world: you pivot and need to begin selling products. In his scenario you do not yet have products other than books to sell, but you decide you want to start switching over your URLs, to get your customers used to that kind of URL path. This is where the light bulb went off in my head. He adds 15 characters to his routes file and the whole world changes:
...
resources :books, path: "products" do
...
This now changes your URLs from `/books/:id/publish` to `/products/:id/publish`. This is amazing to me. He only changed his URL schema; he does not have to go through his app and change his `publish_book_path`, as it is still a viable path. The routes file just forwards new requests to `/products/`.
This is obviously an in-between step, and you will eventually have to go in and change it to `resources :products` and then rename your controller and all that good stuff, but this will let you start indexing your product URL schema without any real changes to your app.
This is my mind blown moment for the day!
I have been writing tests for about a year now, and I have always just done integration/feature tests, because they were easier and I really didn’t know how to use mocks and stubs. This was okay until our test suite at work took a few minutes to complete because of all the feature specs. Now that I have read Fowler’s article, I understand what they are.
Stubs provide canned answers to calls made during the test, usually not responding at all to anything outside what’s programmed in for the test.
Martin Fowler
Mocks … objects pre-programmed with expectations which form a specification of the calls they are expected to receive.
Martin Fowler
In general, stubs use state verification and mocks use behavior verification. As Fowler talks about in his article, a stub would tell you whether your message was marked as sent in a database, and your mock would make sure that the proper methods were called and that they were sent the right things.
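To make the distinction concrete, here is a tiny hand-rolled example in plain Ruby (a sketch of the concepts, not RSpec’s API; the `Notifier` and mailer names are made up for illustration):

```ruby
# A class under test that delegates delivery to a collaborator.
class Notifier
  def initialize(mailer)
    @mailer = mailer
  end

  def notify(message)
    @mailer.deliver(message)
  end
end

# Stub: gives a canned answer; afterwards we would verify *state*.
class MailerStub
  def deliver(message)
    true
  end
end

# Mock: records the calls it receives; afterwards we verify *behavior*.
class MailerMock
  attr_reader :calls

  def initialize
    @calls = []
  end

  def deliver(message)
    @calls << message
    true
  end
end

mock = MailerMock.new
Notifier.new(mock).notify("hello")
mock.calls  # => ["hello"]
```

The stub only cares that `Notifier` keeps working; the mock lets us assert that `deliver` was actually called with the right message.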
You can often use mocks to test that a third-party service is called, such as below:
third_party_mock.should_receive(:create).with(
user_guid: user.guid,
account_guid: account.guid,
attr_1: attr_1,
attr_2: attr_2
)
This allows us to make sure that the behavior is called and that the correct elements are sent with the call.
You can often use stubs to fake out the state of variables so that they return the required data. For example, you can stub out how many elements are returned per page for pagination, like so:
ArticleController.any_instance.stub(:articles_per_page).and_return(1)
This is nice because instead of having `articles_per_page = 10` and then having to generate 11 articles in your test, you can set it to 1 and then only have to create 2 articles to test that your pagination is working.
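The same trick can be hand-rolled in plain Ruby to see why it helps (a sketch; `ArticleList` is a made-up stand-in, not the RSpec API):

```ruby
class ArticleList
  def initialize(articles)
    @articles = articles
  end

  # In the real app this might be 10; stubbing it down means
  # the test needs far fewer records to exercise pagination.
  def articles_per_page
    10
  end

  def pages
    (@articles.size.to_f / articles_per_page).ceil
  end
end

list = ArticleList.new([:a, :b])

# "Stub" the per-page size on just this instance via a singleton method.
def list.articles_per_page
  1
end

list.pages  # => 2: two articles now span two pages
```

With the real value of 10, those same two articles would fit on one page, and proving pagination works would require creating 11 records.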
I think it finally makes sense now, and now that I have a grasp on when to use mocks vs. stubs, I can go and test the world!
I recently hit a problem with using these dotfiles: I wanted to make some changes to my setup but still be able to contribute things back to my coworkers’ dotfiles. So, I started to do some research and was able to get ahold of @r00k and @croaky on Twitter, and @croaky pointed me to this blog post from thoughtbot.
It took me a little bit of thinking and trial and error but I have finally figured it out. So, here we go…
Let’s clone your forked dotfiles repo down (replace `<github_username>` with your GitHub username):
git clone git@github.com:<github_username>/dotfiles.git
Then we can add a remote called upstream, which will be the repo that you forked yours from:
cd dotfiles
git remote add upstream git@github.com:benniemosher/dotfiles
Now we are set. We can update our master branch from our upstream’s master branch:
git fetch upstream
git rebase upstream/master
This will allow us to get any updates that our upstream repo pushes out. We can now change anything we want in our forked dotfiles repo and push it to GitHub. We can begin to personalize our dotfiles and still be able to retrieve updates from our upstream repo.
Let’s say that you have written an amazing macro for your Vim setup and you want to push it out to your upstream repo.
Simply check out your upstream’s master branch under a branch named upstream:
git co -b upstream upstream/master
This will remove all of your Vim customizations and set you back to your upstream’s defaults. We can now make a change on the upstream branch, push it to GitHub, and then make a pull request (PR) comparing our upstream branch to `upstream/master`. This lets you push up any customization that is worthy of being in the upstream repo, and create PRs of only those changes. Once the PR has been approved and merged into our upstream repo, we can update our local copy with those changes:
git fetch upstream
git rebase upstream/master
We do a rebase here instead of a merge so that we can avoid as many conflicts as possible. This takes our customization commits and replays them on top of the latest upstream/master. Now we simply rinse and repeat!
Happy Viming!