Slow Responses from the BrainTree Ruby Gem? Try This Fix.

A few weeks ago I was tasked with mitigating some timeout issues in a client’s Rails app when it makes BrainTree calls. This was becoming more of a problem as the client’s users built up more & more history in BrainTree. Apparently you can’t paginate the results or ask BrainTree to exclude certain parts of the response via the API, so you can end up getting two years’ worth of transaction history that you don’t even care about attached to the piece of data that you do care about. As you’ll see, parsing that potentially big ball of XML can become a problem.

I started by outputting timestamps of the interactions with BrainTree to see if the slowness was on our side or theirs. For many calls it was slow on both ends. As an example, it might take 20 seconds for BrainTree to respond with the XML for the request and then another 28 seconds(!) for the BrainTree gem to parse that response. My client’s server was set to issue a timeout after 45 seconds, so you can see how this was a problem (besides the fact that we wouldn’t want the users to have to wait so long for a response).

As I dug a little deeper I discovered that the gem *should* use the speedy LibXML gem instead of the default REXML to do the XML parsing. Unfortunately, we hadn’t installed LibXML. So I installed and configured LibXML, but I still got the same results. Then I dug into the BrainTree gem’s code and discovered a bug that was preventing it from finding LibXML.

The problem was a simple typo — the gem was looking for “::LibXml::XML” but it should have been checking for “::LibXML::XML”. See the difference? The ‘M’ and the ‘L’ need to be capitalized.
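To illustrate (this is a simplified sketch, not the gem’s actual source), the parser choice boils down to a constant check along these lines, and the misspelled constant never resolves:

# Simplified sketch of the parser check (not the gem's actual code).
# With the typo ("::LibXml::XML") the constant is never defined, so the
# slower REXML path gets used even when libxml-ruby is installed.
parser = if defined?(::LibXML::XML)   # the buggy version checked ::LibXml::XML
           :libxml
         else
           :rexml
         end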

So I changed the gem’s code and ran my test again. This time it still took the same amount of time for BrainTree to send us the XML response, but the parsing took only 2 seconds instead of 28.

I’ve submitted a pull request to BrainTree for this fix. You can see my commit here.


Calculating Standard Deviations in Ruby on Rails (and PostgreSQL)

I need to calculate some Bollinger Bands (BBs) for SwingTradeBot, which is built in Rails 4. Here’s a quick definition of Bollinger Bands:

Bollinger Bands® are volatility bands placed above and below a moving average. Volatility is based on the standard deviation, which changes as volatility increases and decreases.

So I needed to do some standard deviation calculations. I found a few Ruby gems that allow you to do statistics but I quickly ran into issues with them. The general approach of the gems is to monkey patch Array and/or Enumerable, which can cause other issues. I was getting conflicts with ActiveRecord b/c of the monkey patches redefining “sum” and there was another conflict with a different gem that I tried. There are supposedly fixes for this stuff but it just felt dirty.

Then, as I often do, I wondered if I could just get the database to do the calculation for me. If so, it would be faster and I wouldn’t have to monkey patch Ruby or clutter my app with my own standard deviation code. It turned out to be a pretty simple thing to have PostgreSQL do the calc for me. I just needed Rails to produce a query like this:

SELECT stddev_pop(close_price) FROM prices
WHERE (stock_id = 3313 and day_number > 195 and day_number <= 215)

Seems simple enough. So here's the Rails code to do just that:


result = Price.select('stddev_pop(close_price)')
              .where("stock_id = #{stock_id} and day_number > #{day_number - 20} and day_number <= #{day_number}")
              .load
# Note that I couldn't do ".first" on the line above b/c that creates an ORDER BY
# clause that PostgreSQL complains about, b/c the column being ordered by is not
# in the GROUP BY clause...
standard_deviation = result.first.stddev_pop

self.upper_bb = twenty_day_moving_average + (standard_deviation * 2)
self.lower_bb = twenty_day_moving_average - (standard_deviation * 2)
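As an aside, the same query can be written with bound parameters so ActiveRecord handles the quoting; here’s a minimal sketch using the same model and columns as above (the AS alias just keeps the attribute name predictable):

result = Price.select('stddev_pop(close_price) AS stddev_pop')
              .where('stock_id = ? AND day_number > ? AND day_number <= ?',
                     stock_id, day_number - 20, day_number)
              .load   # .load rather than .first, for the same reason as above
standard_deviation = result.first.stddev_pop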

Done!


Ruby / Rails Memoization Gems: Memoist vs. Memoizable

I was just adding some memoization to a Rails app and I was exploring the available gems. I’d used Memoist in the past on another project but I couldn’t remember why I chose it over other gems.

While researching today I found the Memoizable gem and thought that it looked pretty good. It has all these nice badges on the GitHub page, like a CodeClimate score of 4.0. So I figured I’d go with Memoizable.

After installing it and memoizing some methods I realized why I went with Memoist in the past. Memoizable won’t let you memoize methods that take parameters. If you try to do so it will complain loudly with “Cannot memoize Class#method_name, its arity is 1”.

That was a non-starter for me. I switched to Memoist and all is well.
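For reference, here’s roughly what the difference looks like in code (a minimal sketch; the class and method names are made up):

require 'memoist'

class PriceHistory
  extend Memoist

  # Memoist caches results per set of arguments, so methods that take
  # parameters can be memoized.
  def closing_price(day_number)
    expensive_lookup(day_number)
  end
  memoize :closing_price

  private

  # Stand-in for whatever slow work the real method would do.
  def expensive_lookup(day_number)
    day_number * 1.0
  end
end

# Memoizable, by contrast, raises an error like
# "Cannot memoize PriceHistory#closing_price, its arity is 1"
# if you try to memoize the same method.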


I’ve Finally Found a Rails 4.x Blogging Engine / Gem

I can’t believe how difficult it’s been to find a good solution for plugging a simple blog into an existing Rails app. I wanted to add a blog to SwingTradeBot, the new site I’m building, but most answers to this question that I’ve found say to either use RefineryCMS or “roll your own.” Well, I tried Refinery and quickly ran into gem conflicts galore. As for rolling my own… I don’t have time for that; I’d rather use something that’s been thought through and is well suited to the task.

I was ready to give up and just roll my own when I found the Monologue gem. That looked really promising but then I ran into a Rails 4 compatibility issue. However, reading through the discussion thread on that issue I discovered that somebody had created the Blogo gem (plugin / engine).

It’s still early days with this gem but so far, so good for the most part. Installation and set-up went smoothly (in development mode). Here are some things I ran into after pushing to production (on Heroku):

  1. There’s a rake task to create the admin user (rake blogo:create_user[user_name,user@email.com,password]), but it didn’t work in production. I didn’t find out until after creating the user manually in a Rails console that I needed to prepend “RAILS_ENV=production” to the rake command.
  2. The assets were missing. Running “RAILS_ENV=production rake assets:precompile” fixed that.
  3. Note that for comments to appear you need to be signed up for Disqus and you need to enter your site’s Disqus shortname into the Blogo config.
  4. There are some configuration options that I had to discover by digging through the code. See below for an example of what I’ve added to my config/application.rb.

Here’s what’s in my config/application.rb:


Blogo.config.site_title = "SwingTradeBot Blog"
Blogo.config.site_subtitle = "Some clever subtitle..."
Blogo.config.keywords = 'stock trading, technical analysis, stock scanning'
Blogo.config.disqus_shortname = 'swingtradebot'
Blogo.config.twitter_username = 'swingtradebot'


Follow Your Favorite NFL Team on Your iPad in Flipboard

With N4MD’s new NFL coverage it’s simple to stay up-to-date on your favorite pro football team on your iPad. Simply add your team to your Flipboard favorites and you’ll be informed of all the important team news all season long.

Here’s how to add your team to Flipboard:

  1. Launch Flipboard and tap the “+ More…” box or the “More…” in the red ribbon in the upper right corner.
  2. That will open the “Add Content” page. This is where you can search for your team’s magazine. Type the appropriate search term for your team:
    – Type FanMag_Cards for the Arizona Cardinals.
    – Type FanMag_Bucs for the Tampa Bay Buccaneers.
    For all other teams type FanMag_YourTeamName. For example, FanMag_49ers, FanMag_Steelers, FanMag_Cowboys, etc.

    Then just tap the magazine when it appears in the search results.

  3. The final step is to add that magazine to your Flipboard favorites. Do that by tapping the “Add” button at the top of the screen.

Quentin Tarantino, the Master Remixer

I’ve enjoyed the “Everything is a Remix” video series. As an old-school hip-hop fan, I’ve always enjoyed figuring out the origin of samples used in any given track. I guess that’s why part 2 of the remix series, which covered remixing in movies, was so interesting to me. That video touches on some of Quentin Tarantino’s work and then directs viewers to check out another video which goes into depth on Tarantino’s considerable theft, er, reuse of ideas from earlier movies. I had no idea that Quentin “remixed” so much material for Kill Bill. Check it out:


Back Online!

Just a test post. I’m rebuilding my web presence after selling Trader Mike in January. So it’s time to resurrect this old blog which I’ve neglected for years…


Lack of Indexes on Ultimate Tag Warrior Tables

Over the last week or so I’ve been on a mission to improve the performance of my web server, and especially MySQL. I took Arne’s advice and turned on the query cache. That helped but I still needed to do more. After doing some research I discovered MySQL’s slow query log, which does exactly what it sounds like. I enabled slow query logging and set “long_query_time” to 5 seconds. Shortly after I restarted MySQL the slow query count started to rise.

Every query in the slow query log was sent from the Ultimate Tag Warrior WordPress plugin which I use on my other blog. Here are some of the queries:

SELECT count( p2t.post_id ) cnt
FROM wp_tags t
INNER JOIN wp_post2tag p2t ON t.tag_id = p2t.tag_id
INNER JOIN wp_posts p ON p2t.post_id = p.ID
WHERE post_date_gmt < '2007-03-08 21:49:06' AND ( post_type = 'post' )
GROUP BY t.tag
ORDER BY cnt DESC
LIMIT 1;

and

SELECT tag, t.tag_id, count( p2t.post_id ) AS count,
       ( ( count( p2t.post_id ) / 3661 ) * 100 ) AS weight,
       ( ( count( p2t.post_id ) / 1825 ) * 100 ) AS relativeweight
FROM wp_tags t
INNER JOIN wp_post2tag p2t ON t.tag_id = p2t.tag_id
INNER JOIN wp_posts p ON p2t.post_id = p.ID
WHERE post_date_gmt < '2007-03-09 02:27:39' AND ( post_type = 'post' )
GROUP BY t.tag
ORDER BY weight DESC
LIMIT 50;

That led me to take a look at what was going on with the wp_tags and wp_post2tag tables. I did EXPLAINs on the queries and saw that they were doing full table scans instead of using indexes. So I went to look at the table definitions and was surprised at what I saw. The only index on the wp_post2tag table was rel_id, the auto-incremented primary key. So the columns actually used in the joins, tag_id and post_id, had no indexes at all. My SQL is very rusty but I knew that wasn’t a good thing. I also took a look at the wp_tags table and saw that it only had an index on the tag_id column. I’ve seen some queries with “tag = ‘tag_name’” in the WHERE clause, so I figured it would be good to have an index on the tag column as well.

After consulting with my brother, whose SQL skills are much more up to date than my own, I decided to add indexes to those tables. I created an index called ‘tags_tag_idx’ on the wp_tags.tag column. On the wp_post2tag table I created two indexes: the post2tag_tag_post_idx index is on tag_id then post_id, and the post2tag_post_tag_idx index is on post_id then tag_id. I’m not sure if using concatenated indexes is better than just creating a separate single-column index for each column, but I think it’s the way to go after discussing it with my brother and looking at how the wp_post2cat and wp_linktocat tables are indexed. They both have concatenated indexes.
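In SQL terms, the statements were roughly these (the index and column names are the ones described above; the exact syntax I used may have differed slightly):

-- single-column index on the tag name, for WHERE tag = '...' lookups
CREATE INDEX tags_tag_idx ON wp_tags (tag);

-- concatenated indexes covering the join columns in both orders
CREATE INDEX post2tag_tag_post_idx ON wp_post2tag (tag_id, post_id);
CREATE INDEX post2tag_post_tag_idx ON wp_post2tag (post_id, tag_id);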

I ran some queries on the tables before and after to see if things were sped up, and indeed they were. Unfortunately, when I ran EXPLAIN on the queries from the slow query log I saw mixed results. The keys I added now showed up as “possible_keys” and as the actual keys, but the queries still ended up doing table scans. For the wp_tags table the EXPLAIN showed the dreaded “Using temporary; Using filesort”.

So while I didn’t completely solve my slow query problem, the new indexes do help with many of the simpler queries that access wp_post2tag and wp_tags. If you’re using Ultimate Tag Warrior and are concerned about your database load, you may want to add some indexes to the tag tables.


My Top 20+ Movies

In answer to Trader X’s question, here are some of my favorite movies. There’s no way I can rank them beyond maybe the first three. Nor could I stop at just twenty, so with the help of my historical rankings on NetFlix I’ve gone 40 deep. Depending on my mood, any of these could be in the top 20:

  • The Shawshank Redemption
  • City of God (Cidade de Deus)
  • Trading Places
  • A Fish Called Wanda
  • O Brother, Where Art Thou?
  • Clear and Present Danger
  • Friday
  • Malcolm X
  • The Matrix
  • The Devil’s Advocate
  • Rabbit-Proof Fence
  • Memento
  • Get Shorty
  • Rush Hour
  • Brown Sugar
  • The Sixth Sense
  • Austin Powers 1
  • There’s Something About Mary
  • Buena Vista Social Club
  • The Fifth Element
  • Desperado

Honorable Mention:

  • Training Day
  • The Usual Suspects
  • Misery
  • Pulp Fiction
  • Amistad
  • Analyze This
  • As Good as It Gets
  • Better Than Chocolate
  • Black Hawk Down
  • Blade
  • Casino
  • The Game
  • GoodFellas
  • Heat
  • Set it Off
  • Sling Blade
  • The Thomas Crown Affair
  • The Untouchables
  • Donnie Brasco

Archived for Posterity: Kenneth Eng’s ‘Why I Hate Blacks’ Article

Just thought I’d archive some (more) ignorance:

This is a copy of the controversial opinion piece by Kenneth Eng in Asian Week magazine:

Here is a list of reasons why we should discriminate against blacks, starting from the most obvious down to the least obvious:

• Blacks hate us. Every Asian who has ever come across them knows that they take almost every opportunity to hurl racist remarks at us.

In my experience, I would say about 90 percent of blacks I have met, regardless of age or environment, poke fun at the very sight of an Asian. Furthermore, their activity in the media proves their hatred: Rush Hour, Exit Wounds, Hot 97, etc.

• Contrary to media depictions, I would argue that blacks are weak-willed. They are the only race that has been enslaved for 300 years. It’s unbelievable that it took them that long to fight back.

On the other hand, we slaughtered the Russians in the Japanese-Russo War.

• Blacks are easy to coerce. This is proven by the fact that so many of them, including Reverend Al Sharpton, tend to be Christians.

Yet, at the same time, they spend much of their time whining about how much they hate “the whites that oppressed them.”

Correct me if I’m wrong, but wasn’t Christianity the religion that the whites forced upon them?

• Blacks don’t get it. I know it’s a blunt and crass comment, but it’s true. When I was in high school, I recall a class debate in which one half of the class was chosen to defend black slavery and the other half was chosen to defend liberation.

Disturbingly, blacks on the prior side viciously defended slavery as well as Christianity. They say if you don’t study history, you’re condemned to repeat it.

In high school, I only remember one black student ever attending any of my honors and AP courses. And that student was caught cheating.

It is rather troubling that they are treated as heroes, but then again, whites will do anything to defend them.

Here’s some follow up on the situation: Asian paper’s ‘I Hate Blacks’ column assailed
