Sunday, May 31, 2009

Two New User Groups on LinkedIn

I just created two new User Groups on LinkedIn.

1) Microsoft Complex Event Processing
This is for discussions surrounding the new Microsoft CEP (aka Orinoco) product.

2) Aleri
This is for the newly combined CorAleri product. We can discuss the legacy Coral8 and Aleri products, and discuss the parts of the CorAleri roadmap that are not under NDA.

If these groups interest you, please feel free to join.


©2009 Marc Adler - All Rights Reserved.
All opinions here are personal, and have no relation to my employer.

Saturday, May 30, 2009

The First First Derivatives/KDB User's Group

First Derivatives held its first New York KDB User's Group meeting at Liquidnet in New York City on May 28th, 2009. About 100 people showed up, and it seemed that there was no more than one degree of separation between any two people in the room. Jacob Loveless gave a great presentation on the things that he is doing at BGC/Cantor, and Dan Nugent showed a glimpse of a KDB unit testing framework.



Wednesday, May 27, 2009

One Trader's Opinion about Algo Trading and CEP

I was recently talking to a very technical algo trader from another company about CEP. I asked him why he did not use a product like Apama or Truviso for his algo trading needs. His reply was that using the same technology as his competitors would give him no competitive advantage. He has the budget and the resources to build out a custom platform, one which he thought would give him a distinct advantage.



Monday, May 25, 2009

A Brief Look at LINQ and Microsoft Orinoco

This was taken from the Microsoft whitepaper on Orinoco. Thanks to the Orinoco team for providing this to me.


Let's assume that you have a number of power meters that you need to monitor. The "InputStream" contains a stream of tuples where each tuple contains a power reading from an individual meter. The reading happens to come in as the wattage times 10.


We can capture each reading and project a new stream that has the normalized wattage. We can also create a sliding 3-second window of meter readings that we can run calculations against.


var realValueStream =
    from e in InputStream
    select new MeterWattage((double)e.Consumption / 10);

var windowedStream =
    from e in realValueStream.SlidingWindow(TimeSpan.FromSeconds(3))
    select new MeterWattage(e.wattage);


We can then create a stream that has the average power reading over a sliding 3-second period.

var averagedStream =
    from e in windowedStream
    select new OneMeterAverage(CEPAggregates.Average(e.wattage));



(I think that this next query may be incorrect, since the ID field was not captured in windowedStream, but I will continue with the example anyway.)

If you want to get the 3-second sliding average for each individual meter, you can do the following 2 queries:

var meterStreamGroups =
    from e in windowedStream
    group e by e.id;

var groupedAveragedStream =
    from eb in meterStreamGroups
    from e in eb
    select new MeterAverage(eb.Key, CEPAggregates.Avg(e.wattage));



Now, we want to compute the ratio of consumption for each meter over the 3-second window.


var sumStream =
    from e in groupedAveragedStream
    select new MeterSum(CEPAggregates.Sum(e.wattage));

var ratioStream =
    from e1 in groupedAveragedStream
    join e2 in sumStream on true equals true
    select new MeterRatio(e1.Id, e1.Wattage, e1.Wattage / e2.TotalWattage);



Finally, we just want to capture the ratio of power consumption from meter #2.


var filteredStream =
    from e in ratioStream
    where e.id == 2
    select e;
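To sanity-check the logic of these queries outside a CEP engine, here is a back-of-the-envelope sketch in plain Java (my own construction, not Orinoco or Coral8 code; the class and method names are made up) that computes the per-meter average and each meter's share of the total for a single 3-second batch of readings.

```java
import java.util.*;
import java.util.stream.*;

public class MeterRatioSketch {
    // A raw reading: meter id plus consumption (wattage times 10, as in the example)
    static class Reading {
        final int id;
        final int consumption;
        Reading(int id, int consumption) { this.id = id; this.consumption = consumption; }
    }

    // The "groupedAveragedStream" step: normalize (divide by 10) and average per meter
    static Map<Integer, Double> perMeterAverages(List<Reading> window) {
        return window.stream().collect(Collectors.groupingBy(
                r -> r.id,
                Collectors.averagingDouble(r -> r.consumption / 10.0)));
    }

    // The "ratioStream" step: each meter's share of the summed per-meter averages
    static Map<Integer, Double> ratios(Map<Integer, Double> averages) {
        double total = averages.values().stream().mapToDouble(Double::doubleValue).sum();
        Map<Integer, Double> result = new HashMap<>();
        averages.forEach((id, avg) -> result.put(id, avg / total));
        return result;
    }

    public static void main(String[] args) {
        // One 3-second window of readings from two meters
        List<Reading> window = Arrays.asList(
                new Reading(1, 1000), new Reading(1, 1200),
                new Reading(2, 400),  new Reading(2, 600));
        System.out.println(perMeterAverages(window)); // {1=110.0, 2=50.0}
        System.out.println(ratios(perMeterAverages(window)));
    }
}
```

Of course, a real CEP engine does this incrementally as events arrive, rather than over a static batch; this only checks the arithmetic of the grouping and ratio steps.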





Saturday, May 23, 2009

A Strange Dream


I had a very strange dream last night. I dreamed that I left my current job and went to work for Google. When I got there, I found myself in a big loft, surrounded by 20-something-year-old geniuses. I asked them where my desk and computer were. They looked at me with puzzlement. I asked them where all of the computer monitors were, and they didn't know what a computer monitor was. I started describing how, on the trading floor where I worked, the traders had 6 or 8 monitors stacked on their desks. Then everyone started to laugh at me.

It seemed that everyone at Google worked on laptops. They had no conception of what a desktop system was, and the fact that a trader had multiple monitors seemed absurd. The Google-ites didn't even have desks. They just sat on the floor wherever they wanted with their laptops.

I think that the dream was inspired by an article that a friend just sent me.

And, it got me wondering if we at the large investment banks could be doing things in a better way .....



Thursday, May 21, 2009

Orinoco Pricing vs IBM System S ???

The New York Times had a nice article on IBM's System S. You know that it's been a good week for CEP when the Times starts covering it.

IBM has really been rolling out the PR machine on this one. They have been going around to the various investment banks and talking a bit about the pilot that they did with TD Securities, which looks very much like an options market making app.

Jerome wants to know what I think of IBM System S. I hope to have a future blog post devoted to System S and their SPADE language.

Another reader pointed to a quote at the end of the Times article, and wanted to know why I thought that Microsoft should be "giving out Orinoco for free". The quote from the Times article is:

"Buying such an advantage from I.B.M. has its price. The company will charge at least hundreds of thousands of dollars for the software, Mr. Mills said."

This should not be such a shocking quote. Most CEP vendors charge several hundred thousand dollars for their software. But, IBM has been using System S somewhat as a showcase for their Blue Gene supercomputer, and IBM has not made a secret of the fact that System S works great on the grid computing infrastructure offered by Blue Gene. And, if you want to marry System S with Blue Gene, you will find yourself a few million dollars lighter in the wallet. But, then again, a trading firm can make that money up with one good day of trading.

I just cannot picture Microsoft moving into the same pricing model as IBM yet. Orinoco has just started to explore the area of HPC. Orinoco hopes to be able to "process 100,000 events per second" while IBM is claiming to process several million per second. (Of course, what do these people consider to be an event? And, what do they mean by "process an event"? Does this mean complex calculations, outer joins, or does it mean just putting an event into the memory space of the CEP engine? We need STAC!!!)

Orinoco can be best positioned as an add-on to SQL Server, something that Microsoft has to do in order to compete with the new CEP offerings by Sybase and Oracle. Microsoft should be capturing the mindset of every .NET developer, working its way into a company starting at the ground level and permeating the organization upwards.

I can imagine that IBM does not want System S to appeal to the everyday developer ... just to the ones that work for companies and government agencies that have multi-million dollar budgets for real-time analysis. IBM says "If you have to ask about the price, then System S is probably too expensive for you." The actual cost of System S will be a fraction of what it will cost to purchase and deploy the proper IBM hardware and hire the reams of IBM consultants to set everything up and customize System S.



Sunday, May 17, 2009

Wolfram Alpha - Maybe Overloaded?

Entering "IBM June Calls" into Wolfram Alpha freezes my Firefox browser.

Their system must be truly overloaded beyond their wildest expectations.

Update: I tried the "IBM June Calls" in IE8, and it did not hang the browser. Alpha did not completely understand what the query meant. I then typed in "IBM June 100 calls", and at least Alpha understood that I was talking about an option. I further refined the query by entering "IBM June 2009 calls".

Alpha tried to bring up a graph of the Option Payoff Function, with the price of the payout on the Y axis and the underlying price on the X axis, but the graph contained no data.

Looks like Alpha needs to be trained a bit more ....





Experiment with Coral8 and T4

Scott was a big believer in a tool called CodeSmith to automatically generate code. In our group, he used it to take a Coral8 schema and generate serializers, deserializers, and something that we call "ServiceAgents". The problem was that Scott had the only CodeSmith license, so only Scott's machine was able to do the code generation. A few months ago, I stumbled upon the T4 tool in a blog post, and I asked Scott to switch over from CodeSmith to T4. The code generation is now done by a free tool that Microsoft includes in Visual Studio, so everyone can experiment with automatic code generation.

A few weeks ago, I found an article in Visual Studio magazine about T4, and then I saw that Chris Donnan was experimenting with T4.

So, I decided to spend an hour last weekend and code up my own T4 Coral8 class generator. I did not bother to read the T4 manual, but I was impressed with what I could rig up in a very short time. This generates a class for every Coral8 CCS file in a certain directory.

This isn't world-class code .... I generated nullable types and I did not bother to generate null-checking code on the nullable fields. But, feel free to modify it.

I will certainly use T4 to generate classes and serializers/deserializers for Orinoco.




<#@ template language="C#" #>
<#@ assembly name="System.Xml" #>
<#@ import namespace="System.IO" #>
<#@ import namespace="System.Xml" #>
<#@ import namespace="System.Collections.Generic" #>


using System;
using Coral8.Transport;
using Coral8.Value;

<#
    DirectoryInfo dir = new DirectoryInfo(@"v:\dev\CEP\C8Projects\Schemas");
    foreach (FileInfo file in dir.GetFiles("*.ccs"))
    {
        string sSchemaFile = file.Name.Replace(".ccs", "").Replace("-", "");
        List<Info> colInfo = new List<Info>();
        using (StreamReader reader = new StreamReader(file.FullName))
        {
            XmlDocument xmldoc = new XmlDocument();
            xmldoc.Load(reader);
            XmlNodeList columns = xmldoc.GetElementsByTagName("Column");
            foreach (XmlNode node in columns)
            {
                string colName = node.Attributes["Name"].Value;
                string type = node.Attributes["xsi:type"].Value;
                string netType = "unknown";
                string c8FieldType = "unknown";

                if (type.StartsWith("String"))
                {
                    netType = "string";
                    c8FieldType = "StringFieldValue";
                }
                else if (type.StartsWith("Int"))
                {
                    netType = "int?";
                    c8FieldType = "IntegerFieldValue";
                }
                else if (type.StartsWith("Long"))
                {
                    netType = "long?";
                    c8FieldType = "LongFieldValue";
                }
                else if (type.StartsWith("Time"))
                {
                    netType = "DateTime?";
                    c8FieldType = "TimeFieldValue";
                }
                else if (type.StartsWith("Float"))
                {
                    netType = "double?";
                    c8FieldType = "FloatFieldValue";
                }
                else if (type.StartsWith("Bool"))
                {
                    netType = "bool?";
                    c8FieldType = "BooleanFieldValue";
                }
                else if (type.StartsWith("Xml"))
                {
                    netType = "string";
                    c8FieldType = "XmlFieldValue";
                }

                colInfo.Add(new Info(colName, netType, c8FieldType));
            } // end foreach node
        } // end using
#>

#region <#= sSchemaFile #> Serializer
public class <#= sSchemaFile #>
{
<#
foreach (Info info in colInfo)
{
#>
public <#= info.NetType #> <#= info.Name #>;
<#
}
#>
public void ToObject(Tuple tuple)
{
<#
foreach (Info info in colInfo)
{
if (info.C8FieldValue == "TimeFieldValue")
{
#>
this.<#= info.Name #> = new DateTime(((<#= info.C8FieldValue #>) tuple.GetFieldValue("<#= info.Name #>")).Value.Value);
<#
}
else
{
#>
this.<#= info.Name #> = ((<#= info.C8FieldValue #>) tuple.GetFieldValue("<#= info.Name #>")).Value;
<#
}
}
#>
}

public Tuple ToTuple()
{
Tuple tuple = new Tuple();
<#
foreach (Info info in colInfo)
{
if (info.C8FieldValue == "TimeFieldValue")
{
#>
tuple.SetFieldValue("<#= info.Name #>", new <#= info.C8FieldValue #>(this.<#= info.Name #>.Value.Ticks));
<#
}
else if (info.C8FieldValue == "StringFieldValue" || info.C8FieldValue == "XmlFieldValue")
{
#>
tuple.SetFieldValue("<#= info.Name #>", new <#= info.C8FieldValue #>(this.<#= info.Name #>));
<#
}
else
{
#>
tuple.SetFieldValue("<#= info.Name #>", new <#= info.C8FieldValue #>(this.<#= info.Name #>.Value));
<#
}
}
#>

return tuple;
}
}
#endregion
<#
}
#>
<#+
public class Info
{
    public string Name;
    public string NetType;
    public string C8FieldValue;

    public Info(string name, string type, string val)
    {
        this.Name = name;
        this.NetType = type;
        this.C8FieldValue = val;
    }
}
#>
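For context, the template above assumes that the .ccs schema files contain Column elements carrying Name and xsi:type attributes. A file shaped roughly like the following sketch would satisfy it (this is hypothetical; I am going from what the template reads, not from the actual Coral8 file format, and the element names around Column are made up):

```xml
<Schema xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
  <Column Name="Symbol" xsi:type="String" />
  <Column Name="Price" xsi:type="Float" />
  <Column Name="TradeTime" xsi:type="Time" />
</Schema>
```

The StartsWith tests in the template mean that a type value like "StringN" or "Int32" would still map to the corresponding .NET type.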



Saturday, May 16, 2009

Wolfram Alpha

Just saw a screencast for Wolfram Alpha. This blows me away. Tons of ideas. How do we make it real-time?

If it can answer questions like "Show me all options whose buy price is under the fair value", then most of us are out of a job.



First Experiences with Java

I thankfully got the chance to do a little bit of coding over the past few weeks, working on an alerting services framework that is an extension of our CEP system.

Because we have a lot of systems that are in Java, I needed to deliver both a C# and a Java SDK. I used to do a tiny bit of Java back in the late 1990's, but I haven't touched it in over ten years.

So, I had a completely finished C# SDK, and I needed to port it over to Java. The C# SDK lets a system publish alerts over Tibco EMS and over WCF using the webHttpBinding. I knew that I was going to have an issue doing the Java interop to WCF, and I wanted to do this entire effort without reading a lot of documentation.

I started off by downloading Eclipse. I created an empty Java project and started to port my C# code over. Immediately, I started to miss some of the features of C# and Visual Studio that I was used to.

1) Properties. Where are they in Java? Especially the automatic properties of C# 3.0. I don't want to have to write separate get and set methods for every property.

2) Events. That's a big miss for me. I ended up having to use the Observable and Observer classes/interfaces. By the way, I was used to the names of interfaces starting with the letter "I". It's not obvious in Java what is an interface and what is a class.

3) The ability to easily create Web Service proxy code. This is a no-brainer in Visual Studio. For the Java project, I ended up having to download and use the NetBeans 6.1 IDE (which is a lot slower than Eclipse). That said, NetBeans was a nicer IDE in some ways: it understands Eclipse projects, and it has a very easy way to create a Web Services client. I found out through a bit of pain and a lot of blog postings that it is best for interoperability if your WCF service uses the basicHttpBinding instead of the wsHttpBinding or webHttpBinding.
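To make points 1 and 2 concrete for readers coming from C#, here is a contrived sketch of the boilerplate I mean (the AlertPublisher name and listener interface are my own invention, not part of my SDK): a Java class needs explicit get/set methods where C# 3.0 gives you an automatic property, and a hand-rolled listener interface where C# gives you an event.

```java
import java.util.ArrayList;
import java.util.List;

// What C# would express as:  public string Name { get; set; }
// plus something like:       public event Action<string> AlertRaised;
public class AlertPublisher {
    // Hand-written "property" boilerplate
    private String name;
    public String getName() { return name; }
    public void setName(String name) { this.name = name; }

    // Hand-rolled substitute for a C# event: a listener interface plus a list
    public interface AlertListener {
        void onAlert(String message);
    }

    private final List<AlertListener> listeners = new ArrayList<AlertListener>();
    public void addAlertListener(AlertListener l) { listeners.add(l); }

    // The C# equivalent would be a one-line event invocation
    public void raiseAlert(String message) {
        for (AlertListener l : listeners) {
            l.onAlert(message);
        }
    }
}
```

In 2009-era Java, each subscriber would be an anonymous inner class implementing AlertListener, which is noticeably noisier than attaching a C# delegate with +=.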

So now, after all of these years, I can say that I have coded in Java. And, I can say that I have a greater appreciation of C# and Visual Studio.






HP Notebook Crack - Part 2

The original HP case worker never called me back during the 24 to 72 hour time period that they promised.

After calling back again and going through two more levels of the "gatekeepers", I was transferred to India and assigned a new case worker, who promptly informed me that HP had not yet determined if my laptop fell into the same category as the other 200 models of HP laptops that had the hinge crack problems.

I ended up ordering a new front bezel and some other spare parts from the HP website. I will try to do the repairs myself.

The hinge-crack issue, combined with the fact that I have an unbearably noisy fan (with no easy access), makes me think that I will never purchase an HP laptop again.

I may go back to Dell after a hiatus of several years. The Dell laptops that I have had have been built like tanks.




Friday, May 15, 2009

2 Papers on Microsoft CEDR

This technical report consists of two concatenated papers. The first is a theoretical paper describing interesting tradeoffs when designing speculative streaming operators. In addition, it describes specific algorithms with specific tradeoffs and the associated complexity analysis for the aggregate operator. The second paper describes the other operator algorithms in the overall CEDR context and their associated complexity analyses. In all, the papers describe the necessary operator algorithms for building an efficient, computationally rich, speculative stream processing engine.



Event Correlation and Pattern Detection in CEDR
Roger S. Barga (Microsoft Research, Microsoft Corporation, One Microsoft Way, Redmond, WA 98052; barga@microsoft.com)
Hillary Caituiro-Monge (UC Santa Barbara, Computer Science Department, Santa Barbara, CA; hcaituiro@cs.ucsb.edu)




Wednesday, May 13, 2009

CorAleri Update

We met with the good folks from CorAleri yesterday in order to get an update on their post-merger roadmap. Some of the presentation was under NDA, but all in all, things seem to be falling in a fairly predictable path.

Here are the highlights:

1) The merger is going well. The dev teams from New Jersey and California have drunk the mutual Kool-Aid. It was nice to see Jon and Bob sitting next to each other, doing the photo-op. Seriously, it's great to see two of the nicest companies join forces.

2) CCL with Splash is the new language.

3) Work is going on towards a new Eclipse-based IDE.

4) There will be a new engine that incorporates the best of both engines. Aleri is very committed to keeping low latency a big priority.

5) Support for debugging and profiling.

6) Most likely will use the Aleri adapters.

7) The Coral8 engine can now use Aleri's LiveOLAP, and the Aleri engine can now use the Coral8 Portal.


Some of this was announced publicly already by CorAleri, some of these are things that were requested by customers. Basically, no surprises ... just a clear roadmap for consolidation and evolution.



Microsoft CEP - Part 2

Over the next few days, I hope to blog a bit about the new Microsoft CEP product, code-named "Orinoco".

To get yourself a bit more familiar with what was said at the announcement, you may want to go over to Richard Seroter's blog for a great post that summarizes the session at TechEd in which the Microsoft CEP product was announced.

I first got wind of Orinoco in the Spring of 2008 during conversations with some members of the SQL Server group at Microsoft. The Orinoco group maintained radio silence until January of this year, when I got an invitation to come out to Redmond for a Design Review Session. I was joined there by several internal and external Microsoft customers, and we spent 2 days going over Orinoco from front to back.

I will blog a bit more about the technology of Orinoco in some future blog posts. But, I would like to say that Orinoco has the potential to radically change the CEP marketplace.

Orinoco will use LINQ as the "surface language". This builds upon the good work of people like Kevin Hoffman, whose CLINQ is used heavily at Liquidnet. The ability to embed CEP operations within a normal C# application is extremely powerful. You can take advantage of the power of the C# language and you can go back to using mechanisms like inheritance and generics in your CEP applications. This approach combines the expressive power of something like Streaming SQL with the flexibility of something like Apama's MonitorScript.

I don't think that Microsoft has defined their final marketing strategy around Orinoco (although Charles mentions that it will be part of SQL Server 2008 R2). When I met with the Orinoco team in April, they were still gathering feedback from potential customers as to how to market the CEP technology. But, I can imagine that for the immediate future, Orinoco will be free to users of SQL Server. I can also imagine that some part of Orinoco will be free to Visual Studio users (like SQL Server Express) and that an Enterprise version will be licensed like SQL Server. Perhaps a free developer version will come with MSDN? Perhaps it will be a free add-on to SQL Server in the same way that Reporting Services is free. This can be used to drive more SQL Server sales, and put a damper on competitive products like Sybase RAP. It can also be a compelling factor for developers to choose Windows instead of Linux for their CEP servers.

This means that every .Net developer will have access to CEP technology at a price point that will make it extremely attractive.

CEP vendors like CorAleri, Streambase, and Apama will inevitably find themselves under a degree of pricing pressure. When the 2-ton gorilla bursts into the room with a free or low-cost offering, everyone has to sit up and take notice.

Orinoco is still a while away from being as polished as some of the commercial CEP products. But, the prospect of using LINQ, the integration with Microsoft HPC, and the ability to use Visual Studio as the IDE have gotten us pretty excited.

More later....



Tuesday, May 12, 2009

Microsoft CEP - The Unveiling

This announcement is from the Microsoft TechEd site. If you attend this talk, please let me know your impressions. I have been under NDA about this for several months, so I have not blogged at all about the new Microsoft CEP product.

(I guess that the CEP team dropped the code-name and is calling this the All Data Platform?)



DAT309 Low-Latency Data and Event Processing with Microsoft SQL Server
Presenter: Torsten Grabs
Tue 5/12 | 4:30 PM-5:45 PM | Room 151

While typical relational database applications are query-driven, event-driven applications have become increasingly important. Event-driven applications are characterized by high event data rates, standing queries, and millisecond latency requirements that make it impractical to persist the data in a database before it is processed. These requirements are shared by various scenarios across verticals such as manufacturing, oil and gas, power utilities, financial services as well as IT and data center monitoring. In this session we explore technology from the Microsoft SQL Server flight of products that covers these scenarios and addresses those requirements. The session discusses use cases in depth and explains how key components of the Microsoft All Data Platform can be used to build domain-specific solutions. As part of the session, we also go over various deployment alternatives and key dates of the product roadmap.



Tuesday, May 05, 2009

Warning - HP Notebooks have a hinge-crack problem

http://www.notebookhingecrack.com/

I have an HP dv9700t laptop that I bought about a year and a half ago, and it has developed the hinge-crack problem. However, HP has not identified this notebook as one of the models that have the problem (although practically every other HP Pavilion laptop does). Therefore, after my call to HP tech support, they refused to cover my laptop under their free-repair agreement.

When you call HP tech support, you are routed to a non-HP call center in Canada. Their job is to be the "gatekeepers" to the HP Case Managers, the people who determine whether your notebook should be repaired at no charge. After I spoke to two levels of people at the call center, they finally referred my problem to an HP case manager. They told me to expect a call within 24 to 72 hours.

I will let you know what happens.

HP servers are used everywhere on Wall Street. It's amazing that HP has not been upfront about this major problem. The NotebookHingeCrack website has been very instructive in showing how widespread this problem is.

If you are in the market for a new laptop, I urge you to browse this website and know what you may be getting yourself into.



Saturday, May 02, 2009

A Day in the Life of a Chief Architect

Three people won last night's $225M MegaMillions drawing, and none of the three was me. So, I guess that I have to go back to my day job on Monday!

As I mentioned in my last blog posting, I have become so screamingly busy that I have not had as much of a chance to blog as I would like. The increased workload has been mainly due to my new title of Chief Architect of Equities, and the fact that I now manage a few other teams besides the Complex Event Processing team. I "backed into" this position after the last reorganization at my company. When I first joined my company in August 2006, I was hired to work for the Chief Architect of Equities. He "left" the company in January 2008 (I was already out of his group by then), and another person took over the position, albeit in a less "pompous" capacity. That person only lasted one year, and in January of 2009, I was asked if I would like the position. Unfortunately, I am never one to say no, and so here I sit as Chief Architect, waiting for a similar fate to befall me.

In a nutshell, the Chief Architect position is 1/3 paperwork, 1/3 visionary, and 1/3 committees. What I do is most likely done by all of the Chief Architects at all of the various investment banks. There is no secret sauce in what I am about to write. We all have the same kinds of paperwork, we all sit on the same kinds of technical committees, we all meet with the same vendors, and we all read the same magazines and blogs.

The Process

In a company as big as mine, there is an incredible amount of process. Everything has to be approved in triplicate. Every request for a "restricted" piece of software has to be overridden by me. Every request for a server has to be approved by me. I have to eyeball every new purchase, and I have to track which projects have end-of-life components. I also have a dotted line into the CTO of Capital Markets, so Equities has to make sure that they align themselves with whatever roadmap the CTO is working on.

I wake up at 5:30 in the morning, turn on my computer, log in to my work email, and spend the next half-hour doing approvals. First come the server approvals, which are easy. Then come the requests for restricted software. Most companies have a Buy/Sell/Hold rating on every piece of software, and we are no different. We have Approved/Restricted/Prohibited. For some reason, the people who rate the software have decided that some essential pieces of software that people need to do their jobs are in the Restricted category, which means that I need to approve their usage. So, everyone who needs this software has to write a rationale for why they need it, which I need to read, and then either approve or ask questions about.

I also need to conduct Architectural Reviews on the major Equities projects. When you take TARP money, your company is under increased scrutiny. You are audited internally by the Audit team, who like to pick projects and start asking tough questions, mostly around security and high-availability. This is supposed to prepare you for audits by the government, who want to make sure that their TARP investments are well safeguarded. As part of the audit process, you have to show them that the projects have undergone extensive architectural reviews. So, this is another effort that I head up.

As part of the review process, I am constantly looking for duplication of effort, and if I find duplication, I try to push for consolidation. Not surprisingly, there are lots and lots of silos in our division. These silos arise because of the competitive nature between groups, because of acquisitions that we make where the acquired company has their own technology stack, and because of people who we hire from other companies who bring their own ideas as to how things should be done. So, part of the Architect's job is to have a broad vision over both the technology stack and the business flow, and push for consolidation.


The Committees

There are a couple of tech councils that I sit on, some which meet every week and some which meet every month. Most of these tech councils are comprised of the senior technologists and the senior management across the Capital Markets division. What we do is try to tackle a certain topic for a few weeks, like Messaging and Caching. We try to see what is being done now in the firm, what needs we will have, and we try to come up with the ideal future state. From time to time, we also bring in certain vendors to give us a level-setting in our knowledge of a certain technology stack. Committees often involve creating written reports and nice Powerpoints, and yours truly has his share of that kind of work to do.

When the committees make a decision, there is usually some degree of backlash. There are groups who have been invested in certain technologies, and those technologies have just been declared as non-strategic. There are groups who need to support users who are still running Windows 2000, and certain technologies just won't work on older machines and ten year old operating systems. And, there are people who just will not listen to any decision that is made by committee (with good reason). So, part of the committees' jobs are to go out and try to sell the decisions to the populace at large. Which involves more meetings and more Powerpoints.


The Visionary Stuff

This is the fun part of the job. We have to drive the company forward with our technology stack. So, I get to read a lot of blogs, read a lot of magazines like Wall Street and Technology, Waters Magazine, Automated Trader, and the A-Team reports, and I get to see what new technology is out there and what other companies are doing. If something looks interesting, I try to invite the vendors in for a talk. Sometimes these talks result in a POC with the vendor, and I have to coordinate the POC and make sure that other interested people in Equities can take advantage of the POCs. We are interested in all of the usual stuff .... Complex Event Processing, Messaging, Caching, Hardware Acceleration, better and faster ways to create models, faster ways to compute prices, ways of monitoring and improving latency and throughput, etc.

I also get asked for a lot of advice from newly-formed companies. There are a number of start-up companies (and very big companies) who are targeting products towards Capital Markets, and from time to time, I receive requests from these companies for a meeting in order to help validate their ideas. Some people might say that we are spending our valuable time giving free consulting to a small company, but if we really like something, and we are the first ones on the block to see their technology, then we might decide to make a principal investment in the company.


So, please bear with the sparseness of my blogging activities while I try to do things that will make my company a bit more competitive in the market .....


