One Unified Global Perspective
Communications with a Global Perspective
2008 May 31 - Sat

Decision Trees, Automated Trading, Simulations, and Strategies

A paper called Stock Picking via Nonsymmetrically Pruned Binary Decision Trees by Anton V. Andriyashin discusses a method for picking stocks for inclusion in a portfolio. By integrating technical analysis with binary decision trees, the author indicates that "BNS clearly outperforms the traditional approach according to the backtesting results and the Diebold-Mariano test for statistical significance", where BNS is Best Node Strategy. David Aronson of Evidence Based Technical Analysis fame might call the use of some of the technical indicators 'so much snake oil', but the paper, at its heart, does describe a methodology for selecting a potentially profitable portfolio if one can use alternate forms of trading signals.

Alternate forms of decision tree based automated trading can be found in two papers by German Creamer and Yoav Freund called Automated Trading with Boosting and Expert Weighting and A Boosting Approach for Automated Trading. These represent algorithms used in the Penn-Lehman Automated Trading Project. Anyway, the two papers get down and dirty with some of the indicators they use in their trading simulation. Their bibliographies reference a number of good sources of information.

In the PLAT paper, here are a few strategies worthy of further investigation:

  • Case-based reasoning applied to the parameters of the SOBI strategy (see text for SOBI description).
  • Predictive strategy using money flow (price movement times volume traded) as a trend indicator.
  • Market-maker that positions orders in front of the nth orders on both books.
  • Mixture of a Dynamically Adjusted Market-Maker which calibrates by recent volatility, and a trend-based predictive strategy.
  • Sells on rising prices, buys on falling prices.
  • Trades based on relative spreads in the buy and sell books, interpreting small standard deviation as a sign of confidence.
  • Simple predictive strategy using total volumes in buy and sell books.

Peter Stone's group has done well with the PLAT simulations. His papers, with this one as an example, Two Stock-Trading Agents: Market Making and Technical Analysis, have many good implementable ideas for an automated trading strategy. Outside of the world of finance, general algorithmic bidding and optimization strategies are described in The First International Trading Agent Competition: Autonomous Bidding Agents. Another interesting Peter Stone paper, Designing Safe, Profitable Automated Stock Trading Agents Using Evolutionary Algorithms, discusses the concept that common trading rules have weaknesses under various trading conditions. By identifying the conditions, and adaptively switching among rules, trading results can be improved. One more Peter Stone supported effort is the poster: Safe Strategies for Autonomous Financial Trading Agents: A Qualitative Multiple-Model Approach.

Through the use of evolutionary reinforcement on data to which we mere mortals have no access, M.A.H. Dempster has a number of related papers. The bibliographies may be good sources of further inspiration.

In a sort-of-related paper, Robert Almgren and Julian Lorenz provide an insight into Adaptive Arrival Price. A couple of extracts from their abstract:

  • Electronic trading of equities and other securities makes heavy use of 'arrival price' algorithms, that determine optimal trade schedules by balancing the market impact cost of rapid execution against the volatility risk of slow execution.
  • We show that with a more realistic formulation of the mean-variance tradeoff, and even with no momentum or mean reversion in the price process, substantial improvements are possible for adaptive strategies that spend trading gains to reduce risk, by accelerating execution when the price moves in the trader's favor.

Now for a really un-related paper: A market-induced mechanism for stock pinning. The authors suggest that some stock prices can be pinned at strike prices on option expiration dates. As various market participants cover their positions with options and the related underlying securities, some interesting market dynamics unfold.

[/Trading/ReadingMaterial] permanent link


The Joy of Volatility

I initially had this embedded in my follow-on article, but I think the information in this paper bears further scrutiny and testing, with regard to what I think is called pairs trading. I guess the secret is in the selection of the pairs.

The paper is by Dempster/Evstigneev/Schenk-Hoppé, and is called The Joy of Volatility. They apply a coin-flipping strategy to picking a couple of assets, and show that volatility is a positive benefit to portfolio profitability under a dynamic rebalancing strategy versus a buy-and-hold mentality. A couple of key quotes though:

Poverty is the inevitable fate of the passive investor.

Consider making an investment according to a simple active management style: buying or selling assets so as to always maintain an equal investment in both. On average, wealth will double in 80 periods and grow without limits. This investment style rebalances wealth according to a constant proportions strategy. It succeeds, where buy-and-hold fails, because of the volatility of asset returns.

However, as with any investment advice, a word of caution is in order: Constant proportions strategies do well in the long term but, over short time horizons, their superior performance cannot be guaranteed!

[/Trading/ReadingMaterial] permanent link


2008 May 29 - Thu

Evaluating Inter-Process Communication Frameworks

I'm reposting some comments regarding IPC frameworks that I made to the Boost-Users listserv today. It is in response to someone making unsubstantiated remarks regarding the relative merits of ACE and Boost, and to another looking for some substantiated remarks. What follows are some substantiated remarks, based upon my personal experience with ACE and several other libraries.

I've started working on a number of distributed system projects. As a consequence, I started looking for distributed system libraries. References to ACE were the most pervasive. I implemented a number of trial applications with the library, after plowing through the relevant sections in the three primary ACE reference books. That was a good learning experience, if only to learn the various patterns in distributed architecture definition. I had the inter-process/inter-server communications (which only sent simple stuff) working well within ACE's Acceptor/Connector framework. ACE has a number of other patterns one can use. I was really impressed with the fact that the examples I used from the books worked as advertised, and I was able to bend them to my will.

ACE is based upon an interaction of classes, macros, and templates. One has to spend some time with the environment in order to become proficient with it. It has a large API, with a number of lower-level APIs upon which the higher-level APIs are based. For example, the Acceptor/Connector uses constructs described earlier in the books.

Once I had my basic communications going, I realized I needed to get some form of concrete messaging infrastructure in place. I had the impression that TAO, which is a layer above ACE, would be quite extravagant to implement, being an implementation of the CORBA specification. I wanted something a little lighter (a whole lot lighter, actually).

As I worked through that project, I started hearing about ASIO, indirectly through some other libraries I was using. ASIO is now a member of Boost. I read a review somewhere that ASIO is a 'modern' replacement for ACE. If you want to get into real template structures and Boost-oriented philosophy, I'd say that is a valid statement. I'd also say that ASIO is more to the point and straightforward than ACE is, at least for the things I want to accomplish. But like ACE, ASIO is basic communications infrastructure, with no real messaging capability, which is what distributed computing is all about. ASIO turned out to be a little harder to get my head wrapped around, as it uses a number of advanced C++ and Boost-related idioms. For a run-of-the-mill C++ programmer, ACE would be better. For someone steeped in the power and obscurity of advanced C++ who is looking to advance their skill set, ASIO would be better.

I came across RCF - Interprocess Communication for C++, which is a messaging framework riding atop ASIO. Flexible, lightweight, and to the point. I worked through the examples and things worked as advertised. It has encryption, publisher/subscriber, oneway/twoway idioms, and a few other nifty features.

At the same time I was doing that, seemingly coincidentally, I learned a few more interesting facts. Going into this, I realized that I need a message dispatcher/proxy, some decent failover techniques, and some additional event handling for non-IPC related activities.

Someone suggested ICE from www.zeroc.com for an RCF-like solution, but working at a larger scale. I've heard that the library's originator is someone who spent much time on CORBA standards and redid the concept without the 'benefit' of committee involvement. I think the library has all the bases covered in terms of lightweight message handling, dispatching, resiliency, and higher level distributed processing philosophies. The drawback is that it will have a steeper learning curve than would an implementation using RCF. I like RCF, but I think I'm going to have to tilt towards ICE (itself, like RCF, developed and focused towards C++ in a multiple license environment).

On the non-IPC front, Qt's QCoreApplication looks to be a good substrate on which to build event driven daemons.

In the end, I think my solutions are going to involve:

  • ZeroC's ICE for primary inter-process communications
  • Qt QCoreApplication as a base for daemon development (which has built-in stuff for threads/locks, slots/signals)
  • Wt, a C++ based web toolkit for distributed GUI development
  • Boost Libraries to fill in all the holes
  • a little legacy layer 3/4 ACE in one library I'm using, but with some work, I think I can convert the ACE stuff to ASIO

[/Personal/SoftwareDevelopment/CPP] permanent link


2008 May 28 - Wed

Put Me To Sleep Reading Material

Someone in some data provider's forum was making mention of doing order flow analysis in Excel through Interactive Brokers, and the person felt that they weren't getting enough data. Which is true: Interactive Brokers sends data based upon what is necessary for someone viewing a screen, not for some data-hungry automaton looking to crunch full data feeds.

That got me thinking and reading more about order flow analysis. This gets into market orders, limit orders, bid/ask spreads, order books, market makers, rational traders, uninformed traders, the instantaneous impact of variable sized market orders, as well as a whole raft of other micro-economic activity that comes with high frequency trading.

Marco Avellaneda and Sasha Stoikov have recently released a paper entitled High-frequency trading in a limit order book, with another version of the same thing here. They develop some interesting equations on determining a bid/ask spread in the midst of a moving market, based upon a market maker's inventory and risk capability. I'm wondering if that is what BATS does for their trading capability.

Karl Ludwig Keiber has a paper called Price Discovery in the Presence of Boundedly Rational Agents. In the paper, he discusses some market maker concepts and what they deal with. Momentum as well as mean reversion are discussed in the context of bid/ask spread and price discovery. There is a minor discussion regarding adverse selection during a transition from momentum to reversal trading on page 25 which may be of some value. The crossover between reversal and momentum is a weakness in my trading.

Bruce Mizrach has a paper called The next tick on Nasdaq. Although the paper was recently published, it uses data from 2002. The paper goes into some history of market making, limit books, and how Nasdaq grew up. Some of his interesting observations:

  • This paper asks a surprisingly simple but neglected question: does the entire order book help predict the next inside quote revision?
  • Lillo and Farmer (2004) find that orders on the London Stock Exchange follow a long memory process.
  • Bouchaud et al. (2002), while analysing the Paris Bourse, found a power law for the placement of new limit orders and a hump shape for the depth in the order book.
  • Weber and Rosenow (2005) find a log linear relationship between signed market order flows and returns on Island.
  • I find, for example, that the number of bids or offers is more important than the quoted depth.
  • In general, I find that the bids (offers) away from the inside increase the probability of a down (up) tick.
  • The last result I obtain is that this volatility decreases with larger market capitalization and the presence of more market makers.
  • Traders call the market makers or ECNs that frequently appear on the inside market the 'ax', and they claim that taking note of the ax's activity is informative.
  • for example, the advice from the Daytrading University at http://www.daytrading-university.com/samplesson4ways.htm: "Even with the ECN routing that mm's [market makers] use to hide their order flow, there's still plenty of profitable trading to be had by correctly: (1) Avoiding buying when a major mm/ax is selling (e.g. if you see MSCO and MLCO both sitting on the inside ask you probably shouldn't buy if their bid is three levels outside the market) and (2) 'Shadowing' the ax's buying/selling behavior, if you see that all else looks okay, e.g. no suspiciously strong ECN buying/selling on INCA/ISLD."
  • The presence of a particular participant does not by itself indicate that they are significant contributors to subsequent quote revisions though.
  • Looking more closely at individual participants, there are some interesting results. When ARCA takes the inside bid, the next tick is more likely to be a downtick than an uptick in 65 of 71 cases.
  • When ARCA takes the inside ask, there is an uptick in 63 of 73 instances
  • The effect of specific participants in the small cap market differs from the large caps. ARCA has a negative impact from the bid in all 41 cases in which it is statistically significant.
  • A vector autoregression can be inverted into its moving average representation, and one can then compute impulse response functions. In our model of trades and quotes, these have the interpretation of market impact functions, or the effect on stock returns of an unexpected buy order arriving into the market.
  • It can also be explained in an order driven market by what Biais et al. (1995) call the 'diagonal effect', in which they observe that a limit order that improves the inside bid (ask) is more likely to be followed by another limit order which increases (decreases) the inside bid (ask). A similar diagonal effect for trades is present as well. The negative serial correlation in the small caps suggests that the quote revision process for that group can be explained without assuming informed traders.
  • As in many auction designs, additional buy (sell) side interest makes the next price change more likely to be an uptick (downtick). Biais et al. (1999) observe this behaviour even in an environment in which quotes are only indicative. Similarly, in the period in which quotes are firm, the authors find that additional depth on one side of the book helps predict the appearance of additional liquidity on the same side of the book.
  • The number of buyers and sellers, I find, is almost always more important than quoted depth.
  • Aggregate depth, either at the inside market, or as a weighted average of the demand curve, is also helpful, and this information is surprisingly persistent. In general, the results are more successful for large cap stocks than small caps.
  • Quotes away from the inside are generally not informative. Large numbers of buyers (sellers) at tiers away from the best bid (offer) are more likely to result in a downtick (uptick).
  • The model of trades and quotes presented also produces dynamic estimates of market impact. The impact of a buy order can be determined beyond its impact on the current spread. The estimates appear to vary sensibly with standard measures of liquidity.

I wonder if the above snippets could be coded into an expert system.

In Relation between Bid-Ask Spread, Impact and Volatility in Order-Driven Markets by Wyart/Bouchaud/Kockelkoren/Potters/Vettorazzo, the BATS philosophy of infinitesimal market-making can be expressed in terms of spread and the instantaneous impact of market orders. They indicate that there is an empirical correlation between the spread and the volatility per trade. As mentioned in one of the other papers, they confirm that the main determinant of the bid-ask spread is adverse selection. They also confirm that volatility comes from trade impact. The paper has an extensive bibliography worth looking into. There is an interesting corollary in the conclusion, namely that "when the volatility per trade is large, the risk of placing limit orders is large and therefore the spread widens until limit orders become favorable."

[/Trading/ReadingMaterial] permanent link


2008 May 25 - Sun

A Keyword Matching Algorithm

There are a number of well known algorithms out there for taking in a set of keywords and matching them against text. Aho-Corasick comes to mind, as does the Wu-Manber algorithm (the latter I've implemented, and the code resides elsewhere on this site).

For another project, I didn't need something quite so fancy. Actually, two projects come to mind. One is that I have an input comma separated value file which includes stock symbols, a description, and the associated exchange. I wanted to keep statistics on what is read in on an exchange basis. My first kick at the can on this was to implement a string lookup table using a map from the Standard Template Library, with the name of the exchange being the lookup key. Another area I could use this, but with some modifications, is in longest substring matches when trying to do rate table lookups based upon country codes and area codes in VoIP based call control.

In another part of the file, to do some special computing, and in a fit of late night programming, I went so far as to implement a multi-stage strcmp on each exchange to do something special for each.

I knew from the outset that it would be horribly inefficient, but I didn't have a nice algorithm at hand to do it with.

After being reminded of Aho-Corasick in another context, I decided to take another stab at improving efficiency. I ended up implementing half of the Aho-Corasick algorithm, the 'go' function. Since I'm always matching from the beginning of the string, and am not stepping through text, the exclusion of the 'fail' function works quite well, and keeps the code smaller.

At some point in time the code can be converted to something using templates in order to accept various string types, various element matches, and return codes.

Since the 'fail' function isn't implemented, the lookup table can be updated dynamically without being rebuilt from scratch.

The library is quite easy to use. With AddPattern, add new keywords along with some sort of index or object. When using FindMatch with the target keyword, the appropriate object will be returned on a successful match. If no match is found NULL will be returned. A modification to the class would add a default object for use when no matches are found.

Here is the header file.


#pragma once

// Copyright (2008) Ray Burkholder
// Matching against multiple keywords

#include <vector>
#include <string>

class CKeyWordMatch {
public:
  CKeyWordMatch( void );
  explicit CKeyWordMatch( size_t size );  // pre-reserve node storage
  virtual ~CKeyWordMatch( void );
  void ClearPatterns( void );
  void AddPattern( const std::string &sPattern, void *object );
  void *FindMatch( const std::string &sMatch );
protected:
  struct structNode {
    size_t ixLinkToNextLevel;  // next letter of same word
    size_t ixLinkAtSameLevel;  // look for other letters at same location
    void *object;  // returned when keyword found
    char chLetter;  // the letter at this node
    structNode() : ixLinkToNextLevel( 0 ), ixLinkAtSameLevel( 0 ),
      object( NULL ), chLetter( 0 ) {};
  };
  std::vector<structNode> m_vNodes;
private:
};


Here is the code file:


#include "StdAfx.h"
#include "KeyWordMatch.h"

// Copyright (2008) Ray Burkholder
// Matching against multiple keywords

#include <stdexcept>

CKeyWordMatch::CKeyWordMatch(void) {
  ClearPatterns();
}

CKeyWordMatch::CKeyWordMatch(size_t size) {
  m_vNodes.reserve( size );
  ClearPatterns();
}

CKeyWordMatch::~CKeyWordMatch(void) {
  m_vNodes.clear();
}

void CKeyWordMatch::ClearPatterns() {
  m_vNodes.clear();
  structNode node;
  m_vNodes.push_back( node ); // root node with nothing
}

void CKeyWordMatch::AddPattern( 
              const std::string &sPattern, void *object ) {
  std::string::const_iterator iter = sPattern.begin(); 
  if ( sPattern.end() == iter ) {
    throw std::runtime_error( "zero length pattern" );
  }
  size_t ixNode = 0;
  size_t ix;
  bool bDone = false;
  while ( !bDone ) {
    char ch = *iter;
    ix = m_vNodes[ ixNode ].ixLinkToNextLevel;
    if ( 0 == ix ) { // end of chain, so add letter
      structNode node;
      node.chLetter = ch;
      m_vNodes.push_back( node );
      ix = m_vNodes.size() - 1;
      m_vNodes[ ixNode ].ixLinkToNextLevel = ix;
      ixNode = ix;
    }
    else { // find letter at this level
      //ix = m_vNodes[ ixNode ].ixLinkToNextLetter;  // already set
      bool bLevelDone = false;
      size_t ixLevel = ix;  // set from above
      while ( !bLevelDone ) {
        if ( ch == m_vNodes[ ixLevel ].chLetter ) { 
          // found matching character
          ixNode = ixLevel;
          bLevelDone = true;
        }
        else {
          // move onto next node at this level to find character
          size_t ixLinkAtNextSameLevel 
            = m_vNodes[ ixLevel ].ixLinkAtSameLevel;
          if ( 0 == ixLinkAtNextSameLevel ) {
            // add a new node at this level
            structNode node;
            node.chLetter = ch;
            m_vNodes.push_back( node );
            ix = m_vNodes.size() - 1;
            m_vNodes[ ixLevel ].ixLinkAtSameLevel = ix;
            ixNode = ix;
            bLevelDone = true;
          }
          else {
            // check the new node, nothing to do here
            // check next in sequence
            ixLevel = ixLinkAtNextSameLevel;
          }
        }
      }
    }
    ++iter;
    if ( sPattern.end() == iter ) {
      if ( NULL != m_vNodes[ ixNode ].object ) {
        throw std::runtime_error( "Pattern already present" );
      }
      m_vNodes[ ixNode ].object = object;  // assign and finish
      bDone = true;
    }
  }
}

void *CKeyWordMatch::FindMatch( const std::string &sPattern ) {
  // traverse structure looking for matches
  std::string::const_iterator iter = sPattern.begin(); 
  if ( sPattern.end() == iter ) {
    throw std::runtime_error( "zero length pattern" );
  }
  void *object = NULL;
  size_t ixNode = 0;
  size_t ix;
  bool bMatched = true;  // set false if the pattern walks off the trie
  bool bDone = false;
  while ( !bDone ) {
    char ch = *iter;
    ix = m_vNodes[ ixNode ].ixLinkToNextLevel;
    if ( 0 == ix ) {
      bMatched = false;  // ran out of trie before the pattern ended
      bDone = true;
    }
    else {
      // compare characters at this level
      bool bLevelDone = false;
      size_t ixLevel = ix;  // set from above
      while ( !bLevelDone ) {
        if ( ch == m_vNodes[ ixLevel ].chLetter ) {
          ixNode = ixLevel;
          bLevelDone = true;
        }
        else {
          ixLevel = m_vNodes[ ixLevel ].ixLinkAtSameLevel;
          if ( 0 == ixLevel ) {  // no match so end
            bMatched = false;
            bLevelDone = true;
            bDone = true;
          }
        }
      }
    }
    ++iter;
    if ( sPattern.end() == iter ) {
      if ( bMatched ) {  // only return an object on a full match
        object = m_vNodes[ ixNode ].object;
      }
      bDone = true;
    }
  }
  return object;
}


[/Personal/SoftwareDevelopment/CPP] permanent link


2008 May 23 - Fri

A Half Hearted Day

Last night I got some chart software programming accomplished. I can now see bars, trades and quotes. Over the weekend my task is to get some indicators onto them, particularly pivots, Bollinger Bands of two or three different time frames, volume histograms, and a zig zag indicator. A little further down the road, the zig zag indicator will be used for 'snapping' trend/support/resistance lines into place to help solidify some chart patterns.

I looked in on COIL again this morning. I got sidetracked watching it and didn't realize the rest of the market had opened. When I did notice what was happening, a lot of things went south. It was all well and good that I didn't do anything. There will always be another trading day, and hopefully for Tuesday I can have my basket trading in place.

That is, I'm hoping to finish off the order entry bit that talks to Interactive Brokers. In doing so, I can then finish the integration of my order basket tracking. Each evening, I run three different stock selection filters and come up with a total of about 40 different symbols with associated entry parameters. If all goes well, I can do some semi-automated trading: i.e. let the computer get my entries in first thing in the morning, then I can monitor the profit curve and start setting stop-loss points to generate automated exits.

[/Trading/Diary/D200805] permanent link


RCF - Interprocess Communications for C++

For a couple of distributed computing projects, I've been trying to come up with a feasible and easy to use method for making applications talk to each other, whether they be on the same machine or across a network.

I started off doing some work with Douglas C. Schmidt's ACE: The ADAPTIVE Communication Environment. I plowed through ACE's three primary programming books to see what would be the best bit of the environment I would need. I ended up implementing a demo with the Acceptor - Connector framework, just to see how things worked.

I then started on thinking on the messaging structure and the event handling structures. ACE's mixture of macros and classes turned out to be a little overwhelming for what I wanted to accomplish.

During my stint with ACE, I started to use ASIO, from the Boost libraries. I was first introduced to ASIO through working with Wt: WebToolkit. I used Wt as a frontend to a VoIP call sign-in server.

The next step in the evolution is to present a real time call summary report to authorized management as the calls are authenticated, authorized, and accounted for from a Radius server. This means sending call detail messages from the Radius server to a central dispatch server, and then publish to active web clients (with the clients written with Wt).

As Wt uses ASIO for its underlying network communications, and I had read a remark somewhere that ASIO is the new improved ACE, I started to look into it as the mechanism for my inter-process communications. I even got a good chunk of messaging infrastructure written, and was about to get it into testing, when I found it was all for naught.

I came across RCF - Interprocess Communications for C++. It is a library that has been in development for the last few years by a talented fellow by the name of Jarl Lindrud. The library has implemented all the stuff that I only dreamed about doing: publish/subscribing, stream encryption, payload filtering, and any number of other nifty features.

I had a few painful moments in getting the library built. After a couple of messages back and forth to the author, I realized I was trying to build the whole thing into a static library rather than using an 'include' technique to get the platform specific files built.

The client and server examples built and ran without a hitch. I must admit that I was impressed by the examples in the ACE books as well: they compiled and ran with little or no messing about.

The RCF library is better because it deals with serializing native values back and forth, something that ACE only accomplishes when you get into the TAO and CORBA levels of the environment.

So now with Boost (which includes ASIO), RCF (which uses ASIO), and Wt (which also uses ASIO), I think I have all the interprocess tools I need to make my modules talk to each other. Now I can get on with the meat of my projects.

[/Personal/SoftwareDevelopment/CPP] permanent link


2008 May 22 - Thu

Trading Notes: 2008/05/22

I've been trading most days during the month of May. I've been using Interactive Brokers as a broker, and have been using their BookTrader to execute my trades. Regarding things I've learned while using the BookTrader, I'll leave that for another post.

My trading account (real money) is up by 9.4% since April 28, when I first started manual trading, and so far, knock on wood, I've had all positive days, some more positive than others, some a lot more work than others.

I think it is time to keep track of what I do and what I see so I can ensure I don't make the same mistakes more than once.

Limit orders are what I started with. Using a mostly contrarian strategy, I've been able to find some profit areas. I have been caught a couple of times when the market kept going in the wrong direction, and I was getting in deeper and deeper. Those were the rough days where I had to do some tricky trading, and through mostly luck, the symbol recovered enough that I could end positive.

With that said, it is now time to figure out the price levels at which to do reversal orders. I'm setting up some charting to help me with that, and hope to have it done for trading next week.

The news over the last 12 hours has been heavy with the large leap in oil (COIL), traded on the IPE. I've been watching the 2008/July contract, which I traded on paper. The contrarian trading would have worked, interestingly enough, between 11:30 and 12:30 GMT, where it went from 134.25 down to 133.25. I lost my nerve and closed out half an hour into the decline, right at what turned out to be the bottom. It recovered, and then some, in the following half hour, to be back around 134.50 for a few minutes. I was thinking afterwards that I could have put stops at various levels and caught it when it went back up, but thinking it was going to go back up was not really on my mind.

All in all, it was interesting to carry out a risky trade on paper just to see how things would have gone. It is easier to dispassionately analyze the results (monetarily and emotionally) than if it had been real money.

Update 10:05 AST. I saw COIL taking another dip, even lower this time. It went down to 132.50. This one, with real funds, I managed to work 18 trades in and out for a real profit of $643, after commissions, over five minutes.

Regular day trading accounts have a 4:1 margin ratio during the day, and an overnight carry margin of 2:1. On COIL, Interactive Brokers has a different margin structure. When you right click on the symbol and look for symbol details, it shows a multiplier of 1000, which means each contract is worth 1000 times the BookTrader value. So if the ticker is at $133.23, you'll be buying a $133,230 contract. Margin for this is an initial margin of $9375 and an overnight maintenance margin of $7500. This gives over a 10:1 margin capability. The commission ended up being $2.02 per contract.

While writing this, it took another dip and fast recovery. Traders with deep pockets must be making good money on this.

Update EOD: Well, that was an exciting day. Instead of just closing out at the end of those trades, I stayed in for more, but found I didn't reverse when I should have. I lost what I made and now have to try it again. Smarter this time. Watch for the reverses and run with them instead of against them.

The instances where I've gone against them in the past worked out; the prices came back. Not this time. They kept on going.

Breakouts are a good thing, if you've got them going in the right direction. I really need to get my charting fixed tonight to show some of the patterns I've seen. The programming is happening tonight; I hope to have it ready for a try in the morning.

[/Trading/Diary/D200805] permanent link


2008 May 21 - Wed

Confusion by Committee

Reading Rob Weir's An Antic Disposition blog today, I came across a very cogent observation regarding committees:

I have a theory concerning committees. A committee may have different states, like water has gas, liquid or solid phases, depending on temperature and pressure. The same committee, depending on external circumstances of time and pressure will enter well-defined states that determine its effectiveness. If a committee works in a deliberate mode, where issues are freely discussed, objections heard, and consensus is sought, then the committee will make slow progress, but the decisions of the committee will collectively be smarter than its smartest member. However, if a committee refuses to deliberate and instead merely votes on things without discussion, then it will be as dumb as its dumbest members. Voting dulls the edge of expertise. But discussion among experts socializes that expertise. This should be obvious. If you put a bunch of smart people in a room and don't let them think or talk, then don't expect smart things to happen as if the mere exhalation of their breath brings forth improvements to the standard.

The quotation stems from his observations of the committee that was stick-handling Microsoft's OOXML standard through the fast track process. Committees, when doing things properly, can be better than the sum of their parts; but without proper communication and time allotments, they can turn out to be no better than their weakest link.

[/Personal/Business] permanent link


2008 May 05 - Mon

Reducing Traffic on High Cost Inter-ISP Links

AquaLab has released an open source plugin for BitTorrent clients, specifically Azureus. In AquaLab's words, the Ono plugin's "main goal of this plugin is simple -- to improve download speeds for your BitTorrent client."

Here is a press release summary I came across from ACM TechNews:

Northwestern University researchers have developed Ono, software that eases the strain that peer-to-peer (P2P) file-sharing services place on Internet service providers (ISPs). Ono allows users to efficiently identify nearby P2P users and requires no cooperation or trust between ISPs and P2P users. Ono, the Hawaiian word for delicious, is open source and does not require the deployment of additional infrastructure. When ISPs configure their networks correctly, Ono can improve transfer speeds by as much as 207 percent on average, the researchers say. Ph.D. student David Choffnes, who developed Ono with professor Fabian E. Bustamante, says Ono relies on a clever trick based on observations of Internet companies to find nearby computers. Content-distribution networks (CDNs), which offload data traffic from Web sites onto their proprietary networks, power some of the most popular Web sites in the world, enabling higher performance for Web clients by sending them to a server close to them. Using the key assumption that two computers sent to the same CDN server are near each other, Ono can identify P2P users close to each other.
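The redirection-overlap idea can be illustrated with a small sketch. This is a toy of my own making, not Ono's actual code; the peer names and IP addresses are hypothetical. Each peer records which CDN replica servers it gets redirected to for the same CDN-fronted hostname, and peers compare notes: a large overlap suggests the CDN considers them close.

```python
# Illustrative sketch (not Ono's actual implementation) of the
# CDN-based proximity heuristic: peers that the CDN redirects to
# the same replica servers are probably near each other.

def redirection_ratio(my_replicas, peer_replicas):
    """Jaccard overlap of the CDN replica IPs two peers observed."""
    mine, theirs = set(my_replicas), set(peer_replicas)
    if not mine or not theirs:
        return 0.0
    return len(mine & theirs) / len(mine | theirs)

# Hypothetical replica observations for the same CDN hostname.
peer_a = ["203.0.113.10", "203.0.113.11", "203.0.113.12"]
peer_b = ["203.0.113.11", "203.0.113.12", "198.51.100.5"]
peer_c = ["192.0.2.44"]

print(redirection_ratio(peer_a, peer_b))  # 0.5 -> likely nearby
print(redirection_ratio(peer_a, peer_c))  # 0.0 -> likely distant
```

A BitTorrent client could then bias its peer selection toward candidates whose ratio exceeds some threshold, keeping more traffic inside the local ISP.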

This aids two types of communities:

  • Users: faster downloads, because P2P peers are closer and therefore prone to fewer errors and dropouts.
  • Service Providers: traffic can be kept off high cost inter-ISP links. With traffic kept internal, cost savings on carrier links could be realized.

On the negative side though, last mile links get more saturation with higher traffic densities. If one is on a shared cable modem or a shared wireless access point, ironically this isn't the best thing that could happen.

[/Personal/Technology] permanent link


2008 May 03 - Sat

Multi Touch Screens

In a recent issue of Technology Review, there is an article regarding Open Source Multi Touch Displays.

The technology is based on taking an acrylic sheet and projecting video onto the back surface. Around the edges are infrared light emitting diodes focussed to emit light into the sheet. The light bounces around inside from surface to surface.

When someone touches the panel, the light path is interrupted. An infrared sensitive camera on the back side can then be used to distinguish the touch locations. Simple and effective touch technology.
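As a rough illustration (my own sketch, not from the article), a grayscale IR camera frame could be reduced to touch points by thresholding for bright pixels and taking the centroid of each connected bright region:

```python
# Sketch of turning an IR camera frame into touch coordinates:
# threshold for bright pixels, group connected regions (flood fill),
# and report each region's centroid as a touch point.

import numpy as np

def find_touches(frame, threshold=200):
    """Return (row, col) centroids of bright blobs in a grayscale frame."""
    bright = frame > threshold
    visited = np.zeros_like(bright, dtype=bool)
    touches = []
    rows, cols = frame.shape
    for r in range(rows):
        for c in range(cols):
            if bright[r, c] and not visited[r, c]:
                # flood fill to collect one connected blob
                stack, blob = [(r, c)], []
                visited[r, c] = True
                while stack:
                    y, x = stack.pop()
                    blob.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and bright[ny, nx] and not visited[ny, nx]):
                            visited[ny, nx] = True
                            stack.append((ny, nx))
                ys, xs = zip(*blob)
                touches.append((sum(ys) / len(ys), sum(xs) / len(xs)))
    return touches

# Synthetic 8x8 frame with one bright 2x2 "fingertip"
frame = np.zeros((8, 8), dtype=np.uint8)
frame[2:4, 5:7] = 255
print(find_touches(frame))  # [(2.5, 5.5)]
```

Real multitouch frameworks do essentially this, plus background subtraction and frame-to-frame tracking so that each finger keeps a stable ID while it moves.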

If someone could marry Lightfactory's new virtual layout generator with a multitouch board, suddenly lighting design and control would take on a whole new dimension.

Perhaps even using the multitouch capability on the dance floor would introduce a whole new level of dance lighting interaction.

[/Personal/Technology] permanent link



New blog site at: Raymond Burkholder - What I Do

Blog Content ©2013
Ray Burkholder
All Rights Reserved
ray@oneunified.net
(519) 838-6013
(441) 705-7292
Available for Contract Work
Resume

Disclaimer: This site may include market analysis. All ideas, opinions, and/or forecasts, expressed or implied herein, are for informational purposes only and should not be construed as a recommendation to invest, trade, and/or speculate in the markets. Any investments, trades, and/or speculations made in light of the ideas, opinions, and/or forecasts, expressed or implied herein, are committed at your own risk, financial or otherwise.