

2010 May 27 - Thu

Naked Market Orders and the Market Meltdown

At Securities Industry News, Tom Steinert-Threlkeld suggests that naked market orders helped escalate the 'flash crash' and subsequent recovery on May 6. I gather the market makers, who provide liquidity through limit orders, couldn't handle the deluge. And I think we still don't know what hair trigger set off the flood of sell orders.

I learned a new lesson today. The best way to submit market orders, and still get the trade, is to wrap them in limit orders that form a 'collar' around the current price, reducing the risk of a runaway fill. To go along with this, use algorithms that expressly include risk controls.
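A sketch of the collar idea (the function shape and the 0.5% band are my own illustrative assumptions, not from the article): rather than a naked market order, submit a marketable limit order capped a small distance through the last price, so a liquidity vacuum can't fill you at an absurd level.

```python
def collared_limit(last_price: float, side: str, band: float = 0.005) -> float:
    """Price for a marketable limit order: cross the spread, but never
    by more than `band` (a fraction of last price) beyond the last trade."""
    if side == "buy":
        return round(last_price * (1.0 + band), 2)   # willing to pay up, but capped
    if side == "sell":
        return round(last_price * (1.0 - band), 2)   # willing to sell down, but floored
    raise ValueError("side must be 'buy' or 'sell'")

# A buy with the stock last trading at 40.00 is capped at 40.20; if the
# book is empty above that, the order rests instead of chasing the gap.
print(collared_limit(40.00, "buy"))
print(collared_limit(40.00, "sell"))
```

On May 6, orders collared this way would have rested unfilled rather than executing against stub quotes far from the last trade.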

Speculation is that the naked market orders were used by the less experienced: some smaller high-frequency traders and some semi-professional traders.

[/Trading/AutomatedTrading] permanent link

2009 Oct 26 - Mon

Machine Readable News and Algorithmic Trading

A-Team Research has released a special report called: Machine Readable News and Algorithmic Trading.

I've been writing some code to accept a news-release feed from DTNIQ/IQFeed. This report comes in handy by supplying some ideas on how to analyze and make use of the news feed. Here are some examples:

  • When generating trading signals for high frequency traders and other alpha-seekers, it can be used to build sentiment measurement applications, stock screening applications and back-testing systems for trading algorithms.
  • It can be used in support of market surveillance systems.
  • It translates into simple stock-screening applications for individual securities or lists of stocks.
  • It can mean the analysis of macroeconomic data to identify trends, correlations and other relationships.
  • It can involve scanning key parameters to measure market sentiment.
  • It could predict potentially volatile trading days, indicating which stocks or types of stock may be most affected.
  • It can also be used to quickly derive directional signals from the marketplace, and set in play appropriate trading algorithms.
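As a first step along those lines, here is a minimal sketch of the kind of headline processing I have in mind for the IQFeed entries. The word lists and scoring scheme are my own illustrative assumptions, not from the report:

```python
# Tiny hand-built lexicons; a real system would use much larger,
# domain-tuned word lists (and weights).
POSITIVE = {"beats", "upgrade", "record", "growth", "raises"}
NEGATIVE = {"misses", "downgrade", "lawsuit", "recall", "cuts"}

def headline_sentiment(headline: str) -> int:
    """Crude bag-of-words score: +1 per positive word, -1 per negative word."""
    words = headline.lower().split()
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

print(headline_sentiment("Acme beats estimates, raises guidance"))   # positive
print(headline_sentiment("Acme misses on revenue after recall"))     # negative
```

Paired with the symbol list on each feed entry, even a score this crude could feed a stock screen or a back-test of news-driven signals.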

[/Trading/AutomatedTrading] permanent link

2009 May 16 - Sat

High Performance Messaging

Most of what I hear about low-latency trading comes from data vendors who say their market data feeds are 'the best' because they are nearest the data source, and that their infrastructures have been designed for high availability and performance.

I've always thought, though, that adjacency to the market data source forms only a portion of the overall delay budget. It seems to me that 'closeness' to the execution side is just as important, if not more so. This is confirmed by some articles I've recently seen discussing colocation facilities situated to optimally provide this 'betweenness', aka Smart Proximity Hosting.

The third aspect of low-latency trading resides within the compute engine: the engine that receives market data, calculates the trades, performs risk management, sends out the execution requests, and receives the execution confirmations. Copying data into and out of packets, as well as receiving and transmitting them, can be a time-consuming process. Buffer management is a serious consideration in high-frequency trading scenarios (the concept of high-frequency trading being intimately intertwined with the concept of low-latency market data feeds).

I came across Topics in High-Performance Messaging in relation to someone's generic question about how to test throughput on links. Buffer sizing is one of many important topics in optimizing throughput and reducing latency. This paper makes obvious many of the hidden gotchas for the compute engine, the links (how many, what kind, and how they are joined), the feed types, and the supporting L2/L3 infrastructure. Even though I came across it as a generic response to throughput testing, it is written by a group that has spent much time investigating low-latency issues in trading. I see the article as being very useful for squeezing additional milliseconds/microseconds out of the execution cycle time.
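One of the knobs that kind of paper discusses is kernel socket buffer sizing. A minimal sketch (the 4 MB figure is purely illustrative, and the OS may clamp or round the request):

```python
import socket

def make_feed_socket(rcvbuf_bytes: int = 4 * 1024 * 1024) -> socket.socket:
    """UDP socket with an enlarged receive buffer, so bursts from a
    market data feed aren't dropped while the application is busy."""
    s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    s.setsockopt(socket.SOL_SOCKET, socket.SO_RCVBUF, rcvbuf_bytes)
    # The kernel may round or cap the value; read back what was granted.
    granted = s.getsockopt(socket.SOL_SOCKET, socket.SO_RCVBUF)
    print(f"requested {rcvbuf_bytes} bytes, granted {granted}")
    return s

sock = make_feed_socket()
sock.close()
```

Checking the granted size matters: on Linux, for example, the request is bounded by a system-wide maximum, so asking is not the same as receiving.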

Another view on this low-latency issue arises in a blog entry from The Blog of James: Does the need to process volumes of data prohibit lower latency?

There is a news site dedicated to news regarding low latency trading issues:

[/Trading/AutomatedTrading] permanent link

2008 Jun 16 - Mon

Mean Reversion Thoughts

While still putting together the code for a trading solution, I've been thinking about which algorithms to implement for a trading strategy. I have access to live intra-day tick and quote data, so mean-reversion (aka contrarian) strategies seem like interesting candidates.

In the course of manual trading, I've learned that one needs to keep track of a number of items: current portfolio costs, current holding costs, existing profits/losses, expected market direction, current market location, and external influences. This is a lot to do manually; hence the desire to implement tools to automate, or at least semi-automate, the process.

A paper by Subramanian Ramamoorthy called A strategy for stock trading based on multiple models and trading rules discusses a state-space mechanism for determining how to manage the portfolio composition. Another item he brings to the foreground is the Sharpe Ratio, a measure which helps one keep profit consistent rather than wildly variable.
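For reference, the Sharpe Ratio is just mean excess return divided by the standard deviation of returns; the sample figures below are made up to show the contrast:

```python
import statistics

def sharpe_ratio(returns, risk_free: float = 0.0) -> float:
    """Mean excess return over the standard deviation of returns.
    A higher ratio means steadier profit, not merely bigger profit."""
    excess = [r - risk_free for r in returns]
    return statistics.mean(excess) / statistics.stdev(excess)

steady  = [0.01, 0.012, 0.009, 0.011, 0.010]   # consistent daily returns
erratic = [0.05, -0.03, 0.06, -0.04, 0.015]    # similar mean, wild swings
print(sharpe_ratio(steady))   # high
print(sharpe_ratio(erratic))  # low
```

The steady series scores far higher even though the erratic one has a comparable average return, which is exactly the 'consistency over drama' point.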

Using different terminology, the makers of NeoTicker have a blog with an article called Counter-Trend Trading with Simple Range Exhaustion System. The key point, which could be hard to do, is "most counter-trend traders will try to time their entries as close to the extreme reversal points as possible to maximize the profits and minimize the risk exposures". Using multiple time frame charts, and reading the tape, along with some possibly helpful technical analysis tools, it might be possible to home in on the zones of reversal.
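One simple way to quantify 'stretched to an extreme reversal point' (the 20-bar window and 2-sigma threshold are my own illustrative choices, not NeoTicker's system):

```python
import statistics

def mean_reversion_signal(prices, window: int = 20, z_entry: float = 2.0) -> str:
    """Return 'buy' when the last price sits far below its recent mean,
    'sell' when far above, else 'flat'."""
    recent = prices[-window:]
    mu = statistics.mean(recent)
    sigma = statistics.stdev(recent)
    z = (prices[-1] - mu) / sigma
    if z <= -z_entry:
        return "buy"     # oversold relative to its own recent range
    if z >= z_entry:
        return "sell"    # overbought
    return "flat"

print(mean_reversion_signal([100.0] * 19 + [90.0]))  # stretched low -> "buy"
```

The counter-trend trader's problem is visible in the code: the signal only fires once the move is already extreme, and nothing here guarantees the extreme is the extreme.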

Working my way into a little scalping in the futures, an older article at Interactive Brokers explains the birth of the Dow Mini Futures. Some interesting points:

  • "try to identify the leader in a group and how its price movement can help us predict movement in others in the group"
  • "we start to trade it by hand so we can get a better understanding of the nuances in that particular trade"
  • "We have a trader and a programmer trade together for a while and then we start the process of automation. We define our risk parameters and write the rules that we feel give us an opportunity to be profitable."
  • "In our back testing we saw that if we were patient it would be profitable for us. The hard part was learning to be patient because our other successful trades were very high frequency. In the mini-sized Dow we may be in and out of 5 to 10 trades in less than a minute."
  • hedge the mini dow with the underlying basket of stocks
  • "We don't have scalping targets. We generate a theoretical value and make markets based purely on that value. If our pricing is accurate we should naturally be able to scalp."
  • "In the Dow because the bid-ask spread is so tight most of our profits are generated from trading."
  • "The Dow has a much tighter spread compared to the mini-spu. Also it is much easier to watch the stocks in the underlying basket to ascertain their effect on the future."
  • "The Russell tends to be trendier than other indices."

[/Trading/AutomatedTrading] permanent link

2008 Jun 15 - Sun

Adaptive Arrival Price

A keynote lecture at the April 7th Algorithmic Trading Conference in London was given by Julian Lorenz of ETH Zurich. The abstract for his lecture reads as follows:

Electronic trading of equities and other securities makes heavy use of "arrival price" algorithms that balance the market impact cost of rapid execution against the volatility risk of slow execution. In the standard formulation, mean-variance optimal trading strategies are static: they do not modify the execution speed in response to price motions observed during trading. We show that with a more realistic formulation of the mean-variance tradeoff, with no momentum or mean reversion in the price process, substantial improvements are possible by using dynamic trading strategies. We develop a technique for computing optimal dynamic strategies to any desired degree of precision. The asset price process is observed on a discrete tree with an arbitrary number of levels. We introduce a novel dynamic programming technique in which the control variables are not only the shares traded at each time step, but also the maximum expected cost for the remainder of the program; the value function is the variance of the remaining program. The resulting adaptive strategies are "aggressive-in-the-money": they accelerate the execution when the price moves in the trader's favor, spending parts of the trading gains to reduce risk. The improvement is larger for large initial positions.
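This is not Lorenz's technique, but the 'aggressive-in-the-money' behavior can be caricatured in a few lines. The baseline is an even slice schedule, and the boost factor is purely illustrative:

```python
def next_slice(remaining_shares: int, periods_left: int,
               arrival_price: float, current_price: float,
               side: str = "sell", boost: float = 10.0) -> int:
    """Baseline is an even (TWAP-like) slice; when the price has moved in
    our favor, trade a multiple of it, spending paper gains to shed risk."""
    base = remaining_shares / periods_left
    favorable = (current_price > arrival_price) if side == "sell" \
        else (current_price < arrival_price)
    factor = 1.0 + boost * abs(current_price - arrival_price) / arrival_price \
        if favorable else 1.0
    return min(remaining_shares, round(base * factor))

print(next_slice(10000, 10, 50.00, 50.00))  # no move: even slice of 1000
print(next_slice(10000, 10, 50.00, 51.00))  # up 2% on a sell: accelerated slice
```

The real paper derives the acceleration from a dynamic program over a price tree; this sketch only shows the qualitative shape, i.e. that favorable moves speed the schedule up while adverse moves leave it at baseline.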

I think I'll add 'arrival price algorithms' to my keyword searches. The above abstract turned up in a search on 'mean reversion trading system algorithms'.

[/Trading/AutomatedTrading] permanent link

2007 Sep 24 - Mon

Sentiment Indicators with Option Statistics

When I have the time, I've been adding capabilities to my trading software. My current addition is an Option Watcher. Nothing to actually trade with, just something to watch the state of the complete options list for the trading instrument in which I'm interested.

A while ago, or rather, a long while ago, I looked into trading options. That turned out to be a very complicated endeavor. I decided to set it aside and come back to it later. Now isn't quite the 'later' I was thinking about, but I've been keeping options in mind. An article by Jeff Neal of Optionetics, called OUTSIDE THE BOX: Option Statistics as Sentiment Indicators, expanded upon my recent thoughts. Here are a few choice excerpts where he says things better than I can:

One of the best ways to get a handle on sentiment in a particular stock is to monitor the activity of option traders. For instance, monitoring and tracking option volume and option open interest changes can reveal important information in regards to the expectations of traders, as well as how they may be positioned.

Option volume, when unusually high, can oftentimes identify explosive moves, and identifies for the trader just where the action is taking place.

To best forecast a directional change in the market, it is important to monitor the daily gyrations of open interest. The thinking is that small investors are typically on the wrong side of a rally; an unusual increase or decline in the open interest of puts and/or calls often signals a change in directional bias. Usually an abnormal rise or decline in open interest sends a contrarian-type signal to the sentiment trader.
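To make the excerpts concrete, here is a sketch of one such statistic read contrarian-style. The 1.0 and 0.5 thresholds are illustrative conventions, not from the article:

```python
def put_call_ratio(put_volume: int, call_volume: int) -> float:
    """Daily put volume over call volume for an underlying."""
    return put_volume / call_volume

def contrarian_read(ratio: float, high: float = 1.0, low: float = 0.5) -> str:
    """Extreme pessimism (heavy put activity) is read as bullish,
    extreme optimism (heavy call activity) as bearish."""
    if ratio >= high:
        return "bullish"   # crowd is piling into puts
    if ratio <= low:
        return "bearish"   # crowd is piling into calls
    return "neutral"

print(contrarian_read(put_call_ratio(12000, 8000)))  # ratio 1.5
```

The same shape applies to day-over-day open interest changes; only the input series differs.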

[/Trading/AutomatedTrading] permanent link

2007 Sep 14 - Fri

Software Development, Coders, and C++ Libraries

I grew up with Assembler, then Pascal, then C, then C++, then C#, and now I'm back to C++. I've found that C# makes graphical programming easier, but it feels sluggish when doing computationally intensive work. I've since moved back to C++. Development time has increased on some projects, but I think the results are better, and I derive more pleasure from C++ development. And C++ has a rich heritage and a rich library universe. This entry goes through some interesting things I've found.

One of the first libraries I came across was the Boost Libraries. I believe I've written about these before. A few specifics of interest include Regular Expressions, a soon-to-be-released Time Series library, date/time operations, some geometry constructs, state machine tools, and, well, the list goes on.

A few days ago I was looking for a sophisticated web application toolkit. Wt: a C++ Web Toolkit appears to fit that niche very well. It also handles Ajax-like functionality.

To assist with web development and layout, Firebug: A Firefox Addon might be of value for page layout issues. Although it has nothing to do with C++, the main topic here, it does offer a viable solution for checking out web page design.

Earlier today, I came across dzone: fresh links for developers. It has a wealth of links to articles written by developers for developers, developers of all categories and skill sets. Doing a search on C++ comes up with quite a list of articles.

One of the links pointed to The Programmicon. This article is mostly game-based, but gaming shares cross-functionality with many disciplines. It once had two links to resources regarding finance. I was first introduced to Multivariate Embedding Methods by Carol Alexander on page 405 of her book Market Models. Although she won a prize for best price predictor using a model built on that concept, I haven't been completely sold on its applicability. If I had time I'd try it out. However, a key part of embedding is nearest-neighbor analysis. The Programmicon points to a site providing ANN: A Library for Approximate Nearest Neighbor Searching. It also points to TMV - Template Matrix/Vector Library for C++, something else upon which embedding algorithms are built. Embeddings are based upon chaos theory: the concept is to try to find self-similarity in continuous time. When similarities are found, you've got a predictor. Easier said than done.
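The embedding-plus-nearest-neighbor idea can be sketched in a few lines. This is a naive exact search; ANN's whole point is to do the search approximately and quickly in higher dimensions:

```python
def embed(series, dim: int = 3):
    """Time-delay embedding: consecutive lag vectors of length `dim`."""
    return [tuple(series[i:i + dim]) for i in range(len(series) - dim + 1)]

def nn_predict(series, dim: int = 3):
    """Predict the next value as whatever followed the historical lag
    vector nearest (in squared Euclidean distance) to the current one."""
    vectors = embed(series, dim)
    query = vectors[-1]
    best_i, best_d = None, float("inf")
    for i, v in enumerate(vectors[:-1]):
        if i + dim >= len(series):   # must have a known successor
            continue
        d = sum((a - b) ** 2 for a, b in zip(v, query))
        if d < best_d:
            best_i, best_d = i, d
    return series[best_i + dim]

# A repeating pattern: the nearest historical window predicts the repeat.
print(nn_predict([1, 2, 3, 1, 2, 3, 1, 2]))
```

Finding self-similar stretches in real price series is, of course, the hard part; on noise this predictor is no better than a coin flip.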

dzone also re-introduced me to Lua: An Embedded Programming Language. Debian Administration discusses how to incorporate it into C++. I'm thinking it might be useful for scripting signals in a network monitoring package, defining charts in a financial modelling solution, performing information searches in text analysis tools, or performing event and signal handling in a Cricket grapher.cgi rewrite. IEEE Software has an 8-page article called Traveling Light, the Lua Way. Kind of related is Kepler: a Lua-based web development platform.

During a brief flirtation with Fuzzy Logic, where one needs to evaluate line crossings and area calculations, I realized Computational Geometry might be of use. The C++ library Wykobi might be of value for its optimized algorithms. The Code Project discusses its use.
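As an illustration of the kind of primitive involved (a generic orientation test, not Wykobi's actual API), here is two-segment crossing detection:

```python
def orient(p, q, r) -> float:
    """Twice the signed area of triangle pqr: positive for a
    counter-clockwise turn, negative for clockwise, zero if collinear."""
    return (q[0] - p[0]) * (r[1] - p[1]) - (q[1] - p[1]) * (r[0] - p[0])

def segments_cross(a, b, c, d) -> bool:
    """True when segment ab properly crosses segment cd: each segment's
    endpoints must lie on opposite sides of the other segment."""
    return (orient(a, b, c) * orient(a, b, d) < 0 and
            orient(c, d, a) * orient(c, d, b) < 0)

# A fuzzy-membership edge crossing a signal line:
print(segments_cross((0, 0), (2, 2), (0, 2), (2, 0)))  # True: an X shape
print(segments_cross((0, 0), (1, 1), (2, 2), (3, 3)))  # False: collinear, apart
```

The same signed-area primitive underlies the area calculations fuzzy membership functions need, which is why a geometry library earns its keep here.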

I'm currently 'enjoying' MFC based development. I'm wondering if, since I'm still at a relatively early stage, I should be using TrollTech's Qt: Cross-Platform Rich Client Development Framework.

From a Microsoft perspective, Somasegar's Weblog has an article on 'Visual C++ Futures'. There are more than 200 user comments summing up needs, wants, and desires in that universe.

[/Trading/AutomatedTrading] permanent link

2007 Sep 10 - Mon

Internet Information Analysis

In follow-up to a previous post I did on news analysis, I came across Monitor110. They don't release much about how they do things, but what they do release sets the bar for what can be done in analyzing the information found in various sorts of repositories on the web.

[/Trading/AutomatedTrading] permanent link

2007 Sep 06 - Thu

News Analysis

I subscribe to DTN's IQFeed data streams. (If you'd like to sign up, let me know and I'll do a referral for you.) Anyway, in addition to the usual equity, futures, and options feeds, they have a news feed. Each feed entry has a media source indicator, a headline, a list of associated symbols, and an index number for obtaining the story content.

I thought it might be an interesting project to process each incoming message for its symbol list and do some sort of key word analysis to see if one can get a 'mood' of the article. This might provide some interesting trading ideas for the day.

I don't have the time to do it right now, but am recording my thoughts so I can come back to it a little later.

Two recent articles by Paul C. Tetlock in The Journal of Finance, one in the June 2007 issue titled "Giving Content to Investor Sentiment: The Role of Media in the Stock Market", and one in an upcoming issue called "More Than Words: Quantifying Language to Measure Firms' Fundamentals", got me thinking about this again.

One of the articles pointed to the General Inquirer, no, not a racy tabloid, but "a computer-assisted approach for content analyses of textual data". Although GI references an application useful for researchers, I think the interesting content resides in the spreadsheet of categorized words they have. These words can be used to classify the 'mood' of processed text.

The site also points to a book called "The Content Analysis Guidebook" by Kimberly A. Neuendorf as one that might shed further background on the concept. A while ago, I was looking at content analysis from a different perspective, something akin to classifying market analysis and trading blogs. Some additional book references are linked below.

Yoshikoder is an already-built application that can take the GI word lists, process portions of text, and produce analysis summaries.

A brief web search brought up a couple of blogs that offer some perspective on how to apply this sort of analysis:

Some 'possibly' related books:

[/Trading/AutomatedTrading] permanent link

2007 Sep 03 - Mon

Linux, Wine, MFC, Win32 API

For the trading application I'm developing, I was thinking that I'd only be able to run it on a Windows machine, because a couple of vendor-supplied libraries are only supplied as Microsoft Windows .dll's and MFC C++ libraries. Perhaps such is not the case any more.

For whatever reason, I recalled that Wine is a "compatibility layer for running Windows programs". They say they can handle WinSock32 calls, which is probably one of the primary hard things to do.

So I'm hoping I can take the supplied vendor .dll's and my MFC .dll's, load them into the Wine layer, and have them run. If so, I can make further use of some of my remotely hosted Linux servers for hosting my trading platform, without resorting to installing either VMware editions or real Windows platforms.

While on the subject of MFC and such, I want to record a few Win32 API/MFC sites that will help in some of the code development:

[/Trading/AutomatedTrading] permanent link

2007 Aug 05 - Sun

Trading System Design Thoughts: Price - Volume - Time

I spent a couple of years using SmartQuant's QuantDeveloper (now owned by QuantHouse) to evaluate the viability of various technical-analysis-based trading systems. I had great success tweaking simulations to make in-sample solutions work, but when it came to applying the developed scenarios to out-of-sample data, the attempted solutions became woefully inadequate.

In reading various books and blogs, I could see that people trading with traditional technical analysis tools would spend much of their time on the lookout for new stocks with the potential for a large trend, whether up or down. I asked myself: why does one need to jump from stock to stock to find trades? In effect, those traders are looking for directional volatility. As a corollary, it would appear that they are unable to make money when markets go sideways (i.e., don't trend one way or the other).

The people looking for trends will always have market scanners running in order to find the 'hot stock' for the day. Depending upon the sensitivity of the scanner, much of the trend to be found could already have run, with very little left to go. One really needs to be in on the ground floor, but those opportunities are few and far between.

When looking through Amazon book lists for traders, all one really sees are books based upon chart analysis, technical indicators, and stock selection. During my initial research into trading, I did in fact obtain a number of those books. But as already mentioned, I became disillusioned with what they had to say, and I couldn't really put my finger on why. Some good, solid, statistically validated answers became apparent once I obtained "Evidence-Based Technical Analysis" by David Aronson. He basically proved what I had finally learned: a lot of published techniques are only so many words on paper. (I think I ranted about this once before, come to think of it.)

While looking at equity trading, I also did a bunch of research into options trading. Good options traders know all about volatility, and how to make use of volatility in selecting an appropriate options trading strategy. Because of the wide variety of options strategies, and my inexperience with making money in this realm, I decided to back off of options, and move back to equities.

As a side note, the authors (Chacko, Jurek, and Stafford) of a paper entitled "The Price of Immediacy", in a recent issue of the Journal of Finance, "show that limit orders are American options", which is a nice segue into equities. (The article number is 4458.pdf.)

During the transition back to equities, I came across J. Welles Wilder Jr.'s book called "New Concepts in Technical Trading Systems". He appears to be the one who introduced the Average True Range, which is a mechanism for measuring volatility.
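Wilder's ATR in brief. This sketch takes a plain average of the recent true ranges, where Wilder's original applies an exponential-style smoothing; the sample bars are made up:

```python
def true_range(high: float, low: float, prev_close: float) -> float:
    """Largest of: today's range, gap up from prior close, gap down."""
    return max(high - low, abs(high - prev_close), abs(low - prev_close))

def atr(bars, period: int = 14) -> float:
    """Average True Range over (high, low, close) bars, as a simple mean
    of the last `period` true ranges (not Wilder's smoothed version)."""
    trs = [true_range(h, l, bars[i - 1][2])
           for i, (h, l, c) in enumerate(bars) if i > 0]
    recent = trs[-period:]
    return sum(recent) / len(recent)

bars = [(10.5, 9.8, 10.2), (10.9, 10.1, 10.7),
        (11.4, 10.6, 11.0), (11.2, 10.4, 10.5)]
print(round(atr(bars, period=3), 3))
```

The gap terms are the point of the measure: a stock that closes at 10.7 and opens limit-up has been volatile even if the day's own high-low range is narrow.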

With a better understanding of volatility, I set out to use this knowledge in trading equities. I created a stock screener to use end of day data to find equities with good daily volatility. From an absolute volatility perspective, GOOG always landed on the top of the selection list. But one needs to be well financed to trade there as it is currently in the $500 range. ICE turned out to be a good runner up with it being in the $150 range with good daily liquidity.

I haven't assimilated all its nuances yet, but Joseph E. Murphy, Jr.'s book "Stock Market Probability" has much to say on statistics and probability as they relate to stock movement. Although it covers mostly long-term trading, it may be useful for intraday movements.

Content of "Bollinger on Bollinger Bands" by John Bollinger assisted much in terms of understanding and measuring volatility.

In relation to Bollinger Bands, I developed a peak detection tool to determine how often an equity changes trend direction in any given day. The relationship is that peaks will typically correspond to Bollinger Band edges, and point out new edges, so to speak. Since the peak detection tool provides peak determination in a real-time delayed fashion (yes, I know I could explain that better, but it sounded more interesting that way), it can't be used directly as a trading tool, but it does yield some interesting statistics in terms of average peak-to-trough runs and their average duration. On a volatile equity, one gets lots of peaks, some bigger than others. I've found that I should be able to focus on one or two stocks regularly, learn their idiosyncrasies, and as a result trade them profitably, even though, from a daily chart reader's perspective, they may be going sideways. A stock may trade sideways over a period of days, yet still have lots of intra-day ups and downs.
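A sketch of the 'real-time delayed' peak idea. My module's actual logic isn't shown here, and the one-point backoff threshold is my own illustrative choice: a swing high is only confirmed after price has backed off it by some amount, so the signal necessarily arrives after the peak itself.

```python
def confirmed_peaks(prices, backoff: float = 1.0):
    """Indices of swing highs, each confirmed only once price has fallen
    `backoff` below the running maximum; confirmation is late by design."""
    peaks = []
    max_i = 0
    for i, p in enumerate(prices):
        if p > prices[max_i]:
            max_i = i                 # new candidate peak
        elif prices[max_i] - p >= backoff:
            peaks.append(max_i)       # candidate confirmed, after the fact
            max_i = i                 # restart the search from here
    return peaks

print(confirmed_peaks([10, 11, 12, 11.5, 10.9, 11.2, 13, 12.4, 11.8]))
```

The lag is the price of certainty: a tighter backoff confirms sooner but mislabels noise as peaks, which is exactly why the output is statistics rather than trade signals.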

This, in effect, is what market makers do: act as sources of liquidity to traders. They play the market on both sides simultaneously. They enter the market at the beginning of the day directionless, and attempt to end the day directionless, that is, either with no portfolio, or with a portfolio of evenly matched shorts and longs. In options market makers' parlance, this is called ending the day with a zero delta.

You'd think that a book with the title "The Market Maker's Edge", in this case written by Josh Lukeman, would provide some details about market making and how to trade in that manner. Instead, the book has a decidedly technical-analysis bent, with not enough on the higher-frequency perspective on trading. "The Nasdaq Trader's Toolkit" by M. Rogan LaBier does a much better job of introducing one to Level II data and what is happening on the markets. But the book dates itself through screen shots using fractions rather than the current decimalized system.

Though not necessarily devoted to Level II analysis, "Mastering the Trade" by John F. Carter was extremely helpful for learning about various market relationships, including what to look for before the market opens. It also suggested ways to make use of the TRIN and TICK indicators while the market is open. The book "The ARMS Index (TRIN)" by Richard W. Arms, Jr. provides much background on how that indicator works, and it is a very useful tool for determining short-term (intra-day) market movement.
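For reference, the ARMS Index itself is a simple ratio of ratios; the interpretation comments reflect the conventional contrarian reading, and the sample numbers are made up:

```python
def trin(adv_issues: int, dec_issues: int,
         adv_volume: float, dec_volume: float) -> float:
    """ARMS Index: (advancers / decliners) / (advancing volume / declining
    volume). Below 1.0 volume favors the advancers; above 1.0, the decliners."""
    return (adv_issues / dec_issues) / (adv_volume / dec_volume)

# Equal breadth, but volume piling into the decliners: TRIN = 2.0.
print(trin(1500, 1500, 400e6, 800e6))
```

The point of the construction is that breadth alone can look neutral while the volume ratio reveals where the real selling (or buying) pressure sits.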

So, after having said all that, I've come to realize that 'it' is really all about short term (intra day) market movement. Can one make money from all the gyrations of the market? It comes down to statistics and probability: how often are trades within a range and how often and when do they do a range extension?

It comes down to evaluating price, volume, and time.

In using Interactive Brokers Trader Workstation interface, in particular with the BookTrader interface (otherwise known as the ladder interface), one can see the latest price, bid, and ask. When subscribed to Level II, the content of the Limit Order Book is also available. By clicking on the bid or ask column at a price level, one can quickly place Limit Order bids and asks in order to bracket price movement. As price moves, the Profit/Loss of the cumulative position is updated in another column. In addition, a tick histogram is available for determining popular price levels. I find the book trader easier to work with rather than the traditional side by side bid/ask Limit Order book.

About the time I found out how that works, and how effective it is for active trading, I came across a few threads in Elite Trader which discussed this as a 'Non Linear Trading' method. One contributor explained how he used two accounts to work both sides (the buy side and the sell side) of the market at once.

Since IB isn't/wasn't all that much into customer service or special requests, I scouted elsewhere for a broker who would be willing to set something like this up. Genesis Trading turned out to be easy to work with in this regard. They were able to set me up with two trading accounts that draw off the same fund account. The only drawback is that they are mostly equities; they don't do the mini-Dow (YM), which I've been paying attention to in one fashion or another recently.

As Genesis doesn't seem to offer the equivalent of IB's BookTrader for monitoring Price - Volume - Time, I did a quick prototype in SmartQuant's QuantDeveloper. Unfortunately, Genesis' API is somewhat lean when tied to a .NET framework. Genesis, instead, has a robust C++ framework. And since I found the .NET libraries a bit slow, I'm currently rewriting my prototype in Microsoft VC++ 2007. It 'feels' faster, and 'closer to the metal'. C# is good for building systems quickly, but one loses the feeling of 'getting dirty' when working with it.

During trial runs on the C# version, I found I was getting caught up in following the tick rather than keeping track of the big picture in order to bracket trade ranges and follow range extensions. I found I needed to see the 'forest for the trees'.

My Peak Detection module was supposed to help with that, but not as much as I hoped. Then I came across a technique known as the Market Profile. The Market Profile breaks a day into 30-minute slices. The trading range in each time slice is marked with a letter of the alphabet and then 'draped' over the predecessor time frames. This allows one to find where most of the market action is occurring. By bracketing the 70% range, it should be possible to pick up a bunch of good trades with relatively little effort.
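A toy TPO (Time Price Opportunity) builder along those lines. The letter assignment, tick size, and sample brackets are illustrative: each 30-minute bracket stamps its letter across its price range, and the profile is the overlay.

```python
import string

def market_profile(brackets, tick: float = 0.25) -> dict:
    """Build a TPO profile: `brackets` holds (high, low) per 30-minute
    period; each price level collects the letters of the periods that
    touched it, so long rows mark where the action concentrated."""
    profile = {}
    for letter, (high, low) in zip(string.ascii_uppercase, brackets):
        level = low
        while level <= high + 1e-9:          # step the bracket's range in ticks
            key = round(level, 2)
            profile[key] = profile.get(key, "") + letter
            level += tick
    return profile

# Three half-hour brackets of a session:
profile = market_profile([(101.0, 100.0), (101.5, 100.5), (101.25, 100.25)])
for price in sorted(profile, reverse=True):
    print(f"{price:7.2f}  {profile[price]}")
```

The value area (the 70% bracket) then falls out of the profile by accumulating TPO counts around the longest row, which is what I'd maintain in real time.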

There are two recent books, both by Dalton/Jones/Dalton. The older one is "Mind over Markets" and should be read first, as it introduces the concept. The newer, recently released one is "Markets in Profile", which builds further on the theory. My plan is to build and process Market Profiles in real time so as to maintain a 'big picture' view of the trading day.

There are also significant online resources for Market Profile. Much of the initial research was performed by J. Peter Steidlmayer while at the Chicago Board of Trade. The CBOT has a good Market Profile resource area including a free downloadable handbook in the educational resources area. Cisco Futures has a tutorial on Value Based Power Trading, which shares some of the material from the CBOT manual. The tutorial can also be downloaded as a .pdf. They have more links at Value Based Trading Research page.

In one of these references, I came across a remark to the effect that people were having a problem using the Market Profile to build multiple-day strategies. Given that market research indicates any given day has a 50/50 chance of going up or down, I can see why people would have this problem. I think this is another reason not to try holding multi-day positions. Each day should be treated separately. This becomes readily apparent when doing end-of-day recaps and realizing that each day moved due to some different market stimulus.

At the CBOT site, there are two good introductory articles by Jack Broz: Trade by the Book - A Guide to Reading Order Flow and Reading Order Flow. The first uses the Limit Book side by side format, while the second shows the ladder format.

The ladder format is used by many trading applications; Ninja Trader and Button Trader are two that come to mind immediately. However, by the look of them, they don't appear to handle two simultaneous trading accounts. Hence my motivation for coming up with my own application.

Which brings me to the present. My trading software is almost tradable, as in I'll be able to place and cancel Bid/Ask limit orders in a ladder format quite soon. There is a bunch of supporting infrastructure to implement, but the hard bit has mostly been accomplished. I hope to provide a screen capture of it in operation soon.

The goal of TradeFrame, the name I've given the software, is to provide good perspectives on price - volume - time. At each price level, accumulated volumes and ticks are presented. It is able to provide limit order book depth. And through auxiliary charts, it will provide market statistics such as TICK and TRIN.

Then, as time goes by, I hope to add semi-automation. The ultimate goal is to fully automate the process, but that can only be done once I've got a good handle on the manual process.

[/Trading/AutomatedTrading] permanent link

2007 Aug 03 - Fri

Personal Co-location Registry

Paul Vixie hosts a Personal Co-location Registry. If you have a personal 1U server running, say, a trading program or some such, then looking for a place to host it could be as easy as looking at the site.

For trading action, where you're not trading quite enough to colocate right at a market data source or broker, setting up somewhere close to the action might be sufficient. I ended up working for a week near Wall Street last month; walking down Broadway, I passed what is known as the Cunard Line Building, just up from the photogenic Wall Street Bull. I didn't realize the significance at the time, but later found out that Telehouse operates a hosting facility on one of the floors of the building. Look for companies with rack space there if you want to get geographically close to the action.

[/Trading/AutomatedTrading] permanent link

New blog site at: Raymond Burkholder - What I Do

Blog Content ©2013
Ray Burkholder
All Rights Reserved



Disclaimer: This site may include market analysis. All ideas, opinions, and/or forecasts, expressed or implied herein, are for informational purposes only and should not be construed as a recommendation to invest, trade, and/or speculate in the markets. Any investments, trades, and/or speculations made in light of the ideas, opinions, and/or forecasts, expressed or implied herein, are committed at your own risk, financial or otherwise.