Testing Galera in Kubernetes

I deployed a 3-node Galera cluster in Kubernetes. Galera clusters MariaDB or MySQL, allowing you to read and write to any node while remaining fully consistent (ACID) across the cluster at all times. Kubernetes is a deployment environment for containerized applications.

Here are key features of Galera:

– It uses MariaDB instances, and then uses a plug-in to cluster them. So, you are still using stock MariaDB instances.

– It uses the InnoDB table type, which has been my default since it was introduced and is now the out-of-the-box default. InnoDB is what brought ACID to MySQL long ago.

– Every node is both a master and a replica, so you can write to any node.

– Unlike most horizontal clustering, which offers only eventual consistency, Galera provides consistency across nodes at all times.

What this means, from a functional perspective, is that you can continue to use it for OLTP applications requiring ACID.

Its primary benefit shows when a node fails: as long as quorum is met (a majority of nodes still up), the database remains available for transactions.

Enter Kubernetes (K8S), which remedies a node failure as soon as it can. I kill a node, and it brings it back up within a minute or two. In the meantime, the other 2 of 3 nodes remain up and continue to serve transactions, since 2/3 is a majority. This is the primary benefit of Galera, and Kubernetes is an ideal environment for it.

While Galera doesn’t provide load balancing, K8S does: you connect to a single service name, and K8S routes the connection to a node that is currently available.

I tested this by adding a row through one of the surviving nodes while a node I had just killed was being recreated automatically by K8S and was still down. When the killed node was restored, it too had the new row in the table. So, recovering nodes “catch up” on missed transactions automatically.

I have not reviewed the performance impact, but guaranteeing consistency across nodes 100% of the time has a performance cost compared to a horizontal database with eventual consistency. Yet performance is likely to be better than a single node, since replication can be extremely efficient (think low-level processing, without having to duplicate query processing). Your primary benefit, though, is higher availability.

Testing in Kubernetes

If you’d like to give it a whirl, here are instructions for how to test it. 

Create a Kubernetes cluster and deploy a 3-node Galera cluster. I had no problem deploying Galera to a Google Cloud cluster using these 3 YAMLs.

View in Kubernetes console

kubectl proxy

Access via

http://localhost:8001/ui

To use the dashboard’s Skip login option with admin privileges, load dashboard-admin.yaml, which you can create per these instructions.

To test from a local DB client, create a port-forward rule. Here I use a different local port because my machine already has its own MariaDB server listening on 3306.

# Listen on port 13306 locally for port 3306 of pod 'mysql-0'
 kubectl port-forward mysql-0 13306:3306

You can easily kill the port-forward and change the pod name to jump from one instance to another.  When I killed mysql-2, I inserted a row through mysql-0 while mysql-2 was still down.  Then, when mysql-2 was back up, I changed the port-forward to mysql-2 to verify it had the row inserted while it was down.  Alternately, you can port-forward to all 3 pods on 3 different ports, as shown below.
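
For example, assuming the pods follow the mysql-0/1/2 naming used here, something like this keeps a forward open to each pod on its own local port (run each in its own terminal or background them):

# One local port per pod
kubectl port-forward mysql-0 13306:3306 &
kubectl port-forward mysql-1 13307:3306 &
kubectl port-forward mysql-2 13308:3306 &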

To connect, use this from a local client instance where you have MariaDB or MySQL installed:

mysql -h 127.0.0.1 -P 13306 -u root -p
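
Putting the failure test described above together, the sequence looks roughly like this; the test database and table are hypothetical placeholders, so substitute whatever schema you created:

# Kill a node; Kubernetes recreates the pod automatically
kubectl delete pod mysql-2

# While mysql-2 is down, write through a surviving node (forwarded on 13306 above)
mysql -h 127.0.0.1 -P 13306 -u root -p -e "INSERT INTO testdb.messages (msg) VALUES ('written during outage')"

# Once mysql-2 is Running again, stop the old port-forward, point it at mysql-2, and confirm the row replicated
kubectl port-forward mysql-2 13306:3306
mysql -h 127.0.0.1 -P 13306 -u root -p -e "SELECT * FROM testdb.messages"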

To test the Galera cluster you can follow these instructions.

Cleanup

In addition to deleting the test cluster, you’ll need to delete the Persistent Volumes, which you can find under Google’s Compute Engine Disks if you are using GCP.


Added Charting to Automated Trading System

This is a continuation of Developing an Automated Trading System


Many of us use very robust charting software, including the popular thinkorswim platform, that does more than I plan to build into my own system.  The requirement I ran into that could not be met by this software is the ability to chart data unique to my system that isn’t available to the third-party platforms, such as back testing results.

Thus, I needed basic charting that allows me to analyze things in the context of price history.  A fully automated system won’t depend on charts, of course, but I — the human — play a role both in its development and improvement and as part of the automation itself.  To frame the human brain vs. AI discussion: the goal is a “cyborg” at the beginning that becomes more and more machine as time passes. Parts that are proven successful in production will remain in the cyborg while new parts are vigorously tested.

I had a few requirements when comparing charting libraries:

  1. Extensible free open-source.
  2. Works with Angular2, our choice for UI.
  3. Can do price history charting well (stock charts).
  4. Can easily add lines (studies and other calculations).
  5. Can update in real-time.

Other bells and whistles were considered, but those were the core requirements.  I chose ng2-nvd3 as it met these requirements and added extras such as zooming, resizing, and user interactivity.  It is a 3-tier stack:

D3.js – a JavaScript library for manipulating documents based on data.
NVD3 – re-usable charts for d3.js.
ng2-nvd3 – Angular2 component for nvd3.

The center of the stack is NVD3, as ng2-nvd3 just provides an Angular2 interface to it. Interfacing via ng2-nvd3 worked well.  You have complete access to NVD3 capability.  It also updates the chart when you update the data, as you expect from an Angular2 component.  So, this completely met the Angular2 requirement.  

NVD3 is a bit limited, though.  They have a gallery of charts you can view.  It can produce a nice candlestick or OHLC chart with high, low, open and close bars.  But you cannot add lines to these, and the multiChart option does not currently support candlestick or OHLC chart types.  The multiChart type includes area, line and bar charting only.  I can live with this limitation for now.  I just have to chart the close prices of the original price history as a line, and add further lines for things such as MAs.
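
For reference, here is roughly the shape of the options and data handed to ng2-nvd3 for this kind of multiChart.  The field names follow nvd3’s multiChart conventions as I understand them; the series names and sample values are purely illustrative:

// In the component: bind these to <nvd3 [options]="options" [data]="data"></nvd3>
const spxCloses = [{ x: 1484000000000, y: 2263.7 }, { x: 1484600000000, y: 2271.3 }];  // [{x: epoch ms, y: price}]
const rutCloses = [{ x: 1484000000000, y: 1357.2 }, { x: 1484600000000, y: 1364.8 }];

this.options = {
  chart: {
    type: 'multiChart',
    height: 450,
    margin: { top: 30, right: 60, bottom: 50, left: 70 },
    xAxis:  { tickFormat: (d: number) => d3.time.format('%x')(new Date(d)) },
    yAxis1: { tickFormat: (d: number) => d3.format(',.2f')(d) },   // left axis
    yAxis2: { tickFormat: (d: number) => d3.format(',.2f')(d) }    // right axis
  }
};

// One entry per line; yAxis selects which of the two Y axes the series uses
this.data = [
  { key: '$SPX.X close', type: 'line', yAxis: 1, values: spxCloses },
  { key: '$RUT.X close', type: 'line', yAxis: 2, values: rutCloses }
];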

Extensibility. In the long run I’ll one day want a candlestick chart with lines for MAs and other indicators.  I’ll also want lines for fibs and other types of indicators, such as buy and sell signals, which might be up and down arrows, and other types of notation related to back testing.  There are two silver linings to the ng2-nvd3 stack.

nvd3 is open source, so it can be easily improved if one is willing to learn the code.  You can copy and edit the Javascript files your installation is using, then optionally turn your changes into a pull request if you want them to become part of the project.  I talked to the primary committer on the nvd3 project, and he’s eager to accept pull requests.  While having updates committed to the primary project isn’t necessary, it is ideal so you can continue to easily upgrade in the future as well as share your love.

On top of this, you can use d3 on your current charts.  I’ve already used it for some non-graphical utilities.  Your code has access to everything ng2-nvd3 and nvd3 have access to, including, of course, the DOM they generate.  So, you can easily learn and use D3 yourself to enhance your charts, perhaps to add the buy/sell signals, without even changing the nvd3 code.

Developing with D3 and extending nvd3 involves a learning curve.  While I’m heavily immersed with Typescript in Angular2 — and loving it — this does force you back into old Javascript, as d3 and nvd3 are both written in Javascript, not Typescript.  These are by no means show stoppers.  However, it does impact prioritization of time.  For this reason, I’ve limited myself for now to what I can do out-of-the-box as it permits me to get back to the original reason I decided to add charting next — the ability to view back testing results and signals I create.  

User Interface

The UI consists of 3 Angular2 components.  One child for the price history query parameters.  Another child for adding studies.  And the parent that brings those inputs together and outputs the chart.

This uses the Angular2 @Input and @Output decorators, which allow you to tie components together.  Because data binding automatically updates the chart when the data changes, including the chart configuration, you can continue to add to and modify a chart after creating it using the controls.

Angular2 Charting components

Because each child component may require the user to update multiple fields before the chart can be updated correctly, each one has at least one button (Chart and Add).  When a button is pressed, the parent component receives the output and updates the chart.  Note that the StudyEntryComponent is in the early stages of a WIP, yet it can already be used to add MAs to a chart.
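
As a rough sketch of that wiring (the field and component names here are illustrative, not the exact ones in jVest):

import { Component, EventEmitter, Output } from '@angular/core';

// Child: collects the study fields and emits them only when Add is clicked
// (ngModel requires FormsModule in the containing module)
@Component({
  selector: 'study-entry',
  template: `
    <input [(ngModel)]="period" type="number" placeholder="Period">
    <select [(ngModel)]="maType"><option>SMA</option><option>EMA</option></select>
    <button (click)="add()">Add</button>`
})
export class StudyEntryComponent {
  period = 50;
  maType = 'SMA';
  @Output() study = new EventEmitter<{ period: number; maType: string }>();
  add() { this.study.emit({ period: this.period, maType: this.maType }); }
}

// Parent template: <study-entry (study)="addStudyAndRefreshChart($event)"></study-entry>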

Charting Input Components

As you make modifications, clicking the Chart or Add buttons updates the chart.  You can also edit existing MAs by selecting one, changing it, and then clicking Chart.  The next image shows the table that is created as you add or edit MAs, along with the resulting chart.

Charting Output – Comparison with MAs

This chart demonstrates several features using nothing but out-of-the-box nvd3. 

If you resize the browser window, the chart automatically resizes.   While you can’t view the effect in the static image above, trust me, it works.  Have doubts? Check out the demos I linked to earlier.   

You can compare items using two different Y axes.  In this case, the Russell 2000 ($RUT.X) is on the right axis.  The chart currently creates studies for every underlying asset on it, so when we add an MA, it appears for both the S&P 500 ($SPX.X) and the Russell.  Being a two-dimensional chart, you cannot have more than two Y axes.  If you include a third item or more, they will share the right axis, which will be extended to handle the full range of possible values.  Which axis an item belongs to is something you control as you set up the data.  But you cannot have a third Y axis, so you have to factor this into the design and how raw data is handled, with the impact on the Y range being your primary concern.  Combining an item that ranges from 0 to 2 with an item that ranges from 2000 to 2200 on one Y axis will result in two flat-looking lines far apart.

The user can interactively hide/show any of the lines by clicking the legend.  You can see above that $RUT.X 200 EMA-we and $RUT.X 50 SMA-mo are both hidden because their circles in the legend are not filled in.   

Another feature that differs from some charting software is that the interval of the MAs is not limited to the interval of the chart.  While the chart is displaying weekly bars here, we added monthly MAs.  This is important because the algos will typically use one-minute bars for historical data and real-time quote updates arriving one or more times per second, yet need to be able to calculate MAs with intervals from 5 minutes to monthly.
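
Conceptually, supporting a study interval coarser than the bar interval just means collapsing the fine-grained bars into buckets before averaging.  A simplified sketch (the real calculation lives server-side and handles calendar intervals such as months properly):

// Last close per time bucket, e.g. bucketMs = 5 * 60 * 1000 for 5-minute buckets
function resampleCloses(bars: { time: number; close: number }[], bucketMs: number): number[] {
  const lastClose = new Map<number, number>();
  for (const bar of bars) {
    lastClose.set(Math.floor(bar.time / bucketMs), bar.close);
  }
  return Array.from(lastClose.entries()).sort((a, b) => a[0] - b[0]).map(e => e[1]);
}

// Simple moving average of the most recent `period` bucket closes
function sma(closes: number[], period: number): number {
  const window = closes.slice(-period);
  return window.reduce((sum, c) => sum + c, 0) / window.length;
}

// e.g. a 10-period, 5-minute SMA computed from one-minute bars:
// const fiveMinSma10 = sma(resampleCloses(minuteBars, 5 * 60 * 1000), 10);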

Round Trip Data Flow

Currently, when it needs to update the chart, it simply does a REST call for price history, which has the ability to add studies via parameters.  When those results come back, our UI side transforms the data using Typescript into the representation required to chart it, and simply replaces the data field in the ChartNVD3PriceComponent given to nvd3 to create the chart.  Due to data binding, the chart updates the instant this data is updated.
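
In Angular2 terms the update is little more than this; the service and method names are illustrative stand-ins for the real ones:

// Ask the server for price history (studies are added server-side via the same parameters),
// reshape the JSON into nvd3 series, and let data binding redraw the chart
this.priceHistoryService.get(this.queryParams).subscribe(history => {
  this.data = this.toNvd3Series(history);   // new reference; ng2-nvd3 re-renders automatically
});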

The REST call itself uses the parameters to construct and invoke a third-party API call.  Our facade to the API converts the raw data returned into POJOs.  Because our interface to the API uses caching, this could be in memory and returned instantly.  With price history in POJOs, our service then adds studies to the data as new fields.  Then, it converts the POJOs to JSON and returns it as the output of the REST call.

Our Angular2 component receives this data, transforms it into the charting representation, and updates the chart data.

Looking Forward

Adding charting to the application gets us started so we can begin to create JSON of back testing results that can be used to produce charts.  To add back testing results to charts, in Angular2, we’ll be creating a new UI component for defining back testing requirements, much like the one we created to add studies. 

The exception to a simple one-trip REST query might be if back testing takes longer than it does today due to new complexity and permutations.  In that case, I’m likely to redesign it to simply add the request to a back testing queue and allow the user to monitor the queue and view results when available.  One advantage of this is that results can be viewed at any later time, so long as the query remains on the list of previously queued requests.

WebSockets can be used to update the queue in the browser without the user having to click.  You will be able to see, in real-time, the progress of your request. 

WebSockets can also be used to update the chart in real-time.  This will be important when using real-time quotes and monitoring trading.  With the exception of the data coming through WebSockets instead of REST, we won’t really need to change how charting works in Angular2, as it already updates the chart whenever the data changes.  The only difference will be how the data changes. Since we already use Angular2 for real-time updates of Level I and II quotes, monitoring of predictions, and order flow, using WebSockets to update a chart does not introduce a new technical feat.
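
On the Angular2 side, the mechanics would look roughly like this; the endpoint and message shape are illustrative:

// Append streamed quotes to the bound chart data; the chart redraws via data binding
const socket = new WebSocket('wss://jvest.example/quotes');            // hypothetical endpoint
socket.onmessage = (event: MessageEvent) => {
  const quote = JSON.parse(event.data) as { symbol: string; time: number; price: number };
  const series = this.data.find(s => s.key === quote.symbol);
  if (series) {
    series.values = [...series.values, { x: quote.time, y: quote.price }];
    this.data = [...this.data];                                        // new reference so change detection fires
  }
};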

 


Created Backtesting of Signals and Algos

This is a continuation of Developing an Automated Trading System


I began the algorithms with simple strategies.  The backtester tests a range of inputs for a strategy. For example, you can test a range of trailing stops from 1% to 15% in 0.5% steps. That tests roughly 30 scenarios against the same data.

You can combine strategies testing multiple ranges.  If your ranges include 10 target values and 10 stop values, it will test 100 scenarios, as it tests every combination of your ranges.  There is no limit to the number of ranges you can combine. The REST call to create the backtest parses your strategies, creates entry/exit factories, and iterates through the ranges.
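
The expansion of ranges into scenarios is just a cartesian product.  A small illustrative sketch of the counting (the real engine is server-side and strategy-aware):

// All values for one range, e.g. range(1, 15, 0.5) for trailing stops of 1% to 15%
function range(start: number, end: number, step: number): number[] {
  const values: number[] = [];
  for (let v = start; v <= end + 1e-9; v += step) values.push(Number(v.toFixed(4)));
  return values;
}

// Every combination of values across all ranges becomes one backtest scenario
function combine(ranges: number[][]): number[][] {
  return ranges.reduce<number[][]>((acc, r) => acc.flatMap(combo => r.map(v => [...combo, v])), [[]]);
}

// e.g. 10 target values x 10 stop values = 100 scenarios
const scenarios = combine([range(1, 5.5, 0.5), range(1, 5.5, 0.5)]);
console.log(scenarios.length);   // 100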

On the entry side, I’m creating indicators that can be used to fire signals.  While the signals are simple today (all true, all false), the logic can become complex as algos become aggregations of signals weighted to make a decision.  This will be fed into machine learning and other techniques for prediction and optimization.

Technical description: No new technology here.  This introduces a pattern of phased data enhancement.  

jVest back-test data flow

I was recently inspired by the AI series Westworld.  This led me to increase generification, conceptual streaming, and phased data enhancement, as I imagined the result being a high-performance real-time analytics engine that could potentially handle complex decisions beyond the current application.  The goal here is ultimately to build an AI engine driven by practical purpose rather than theory, as well as a real-time analytics engine that can be deployed to solve a number of problems in various industries.

For this reason, the back testing algos are designed to support real-time price updates that include time so they can handle their own temporal requirements, much like the human brain continuously analyzing real-time signals to help you make decisions.


Continued posts on Developing an Automated Trading System

Added Charting to Automated Trading System (Jan 18, 2017)


Developing an Automated Trading System

It has been my dream, ever since I was a kid and saw the Matthew Broderick movie War Games in a theater in 1983, to build an automated trading system using AI.  This year I have begun to live that dream.  I call this project jVest.

Curious how I built it?  Here is a description of what I created:

jVest is a real-time, high-transaction-volume market analysis and trading system implementing machine learning for predictive analytics. It has an HTML5 UI using WebSockets to continuously stream real-time information to the browser.  The UI itself is largely built using Angular 2 with Typescript. It uses JMS to distribute incoming third-party data messages to subscribers after converting them to POJOs.  One listener persists select data in an RDBMS using JPA.  Another listener streams the data to business logic, which then streams its transformed output to the web UI via WebSockets.  It also uses the Observer pattern to flow the data through many consumers and producers, and injection (CDI) to simplify the complex interactions and provide a highly extensible design.

It uses REST for user interactions with the server and WebSockets for real-time streaming. To handle conversion of POJOs to/from XML, it uses JAXB along with the marshalling capabilities of Resteasy in the REST layer.  All transportable POJOs implement a Marshallable interface providing common default methods for JAXB marshalling to/from XML and JSON, which is used for WebSockets as well as for converting XML from upstream data providers into POJOs.

For machine learning, it uses JServe to integrate with R over TCP/IP, which permits high scalability through distributed R nodes.  R then handles the predictive algorithms such as linear regression and KNN. Users can query via the web UI to analyze market data, as well as create monitors that regularly update the predictions in the web UI via WebSockets.

As you can see, it is not only a fun project, but it has also brushed up my skills with the latest updates to the Java EE stack. On top of that, I learned completely new things, such as Angular 2 with Typescript, which is becoming a very popular UI solution that encourages creating highly re-usable UI components, and R, one of the leading languages for machine learning with a plethora of readily available statistical analysis packages.

Using R to do machine learning is not only cutting edge in terms of technology, it is a rapidly growing area due to the proliferation of large quantities of data in computers, faster computers, and cheaper storage.

Limits to computer speed and storage were the reasons I set aside pursuing AI in the 80s after my initial dabbling with it.  It is exciting to be able to pick up that dream today now that hardware can do things barely imaginable in the 80s.

Oct 28, 2016 – Added clustering via WebSockets

I added the ability to cluster servers using WebSockets.  This overcomes a limitation: the upstream data provider permits only one live real-time stream per account, though all the nodes can use the upstream provider’s synchronous API (think “REST”).  To overcome the stream limitation, other nodes can now connect via WebSockets to receive the real-time streaming data.  Because each instance of the application can act as both a provider and a consumer, this allows for theoretically unlimited scaling of the business process and web/UI tier through a hierarchical topology.  Because all streaming services are initiated through REST or scheduling, this is 100% dynamically configurable at run-time, both from a user-driven perspective and, down the road, perhaps for automated discovery, load balancing and high availability.

Technical description: this uses WebSockets for Java-to-Java communication in an EE web container.  The provider endpoint picks up messages to send using @Observes, and the client side fires the messages it receives just as it would have if they had arrived in its MDB, as they do on the node running the data collector that handles real-time streaming with the upstream provider. This demonstrates the powerful pluggability and extensibility of the Observer pattern. It uses a WebSockets encoder and decoder to convert all messages to/from JSON for serialization. The en/decoders were easy to create since all message POJOs entering and exiting the endpoints internally support JAXB bi-directional marshalling of both XML and JSON accessible through a common interface.


Continued posts on Developing an Automated Trading System

Created Backtesting of Signals and Algos (Nov 30, 2016)
Added Charting to Automated Trading System (Jan 18, 2017)


Internet Relay Chat (IRC) and the Berlin Wall Falling

This is the story of using IRC when it first came out, long before most people had heard of the Internet.

Connecting Through College

In the late 80s, when I was in college, I created an account on the school’s computers.  I dialed in with my modem from home.  When you dialed in, you were presented with a UNIX prompt. This was an all text world.  No images.  No nice web pages.  Just a command prompt and programs that output text.

In the 80s, all the colleges in the US were connected to the Internet.  There was no commercial dial-up service like AOL, yet.  So, it was virtually all academics and scientists.  There were no corporations.  No one charged for anything.  No one competed.  It was just people connecting to people and information.

One of the first commands I learned was ‘irc’.  Once you use this command to connect to an Internet Relay Chat (IRC) server, you type /help, and from there learn other commands.  I quickly discovered thousands of channels with thousands of people from all over the world.

The purpose was to simply let people chat.  It is a myth that the Internet became social in the late 90s.  The Internet was social from the beginning, particularly with IRC.

What was surprising was that the Internet included people and servers all over the world that spoke many languages, although English did seem to be the predominant language.

Want to talk to people in Spain?  Join the #spain channel.  Germany?  #germany.  When virtually no one had heard of Linux yet, there was always the #linux channel.  Want to create your own channel?  Just “/join #mychannel” and boom, you just created a new channel.  It was a level playing field in that anyone could create a channel and invite people to participate in it.  And, you could join any open channel.  Though, there were ways to make channels hidden and require passwords to enter.  There was never a fee, and most were openly there for anyone to join.  The only requirement for using IRC was an Internet connection and a client program like ‘irc’ to connect.


Free Communication Without Geographic Boundaries

This was an era when international long distance was prohibitively expensive.  Prior to IRC, you’d never dream of talking to people all over the globe.  So, imagine how exciting it was when, in 1990, one year after the Berlin wall fell, I was talking in IRC to someone who grew up in East Berlin!  I asked questions like, what was it like when the wall came down?  What was life like growing up behind the iron curtain?  How are you doing now that 1 year has passed and you’re now integrated with West Germany?

I’m not sure I could have called someone in East Germany yet, since it was behind the iron curtain just a year earlier, when it was unimaginable that you could call people there from the US.  I’m pretty sure you couldn’t do it prior to the wall coming down.  And, even if you could, it would have been very expensive, if the person you wanted to call even happened to have a phone.  My brother went to Moscow University in 1993 under the Perestroika program.  It cost us $30/minute to call him.  I tried to get him onto IRC, of course.  But, that never panned out.

IRC Today

To be sure, it hasn’t changed much today.  It’s bigger, of course.  There are more servers.  There are lots and lots of bots (automated programs) on the IRC.  There are still a lot of people across the globe using it.

However, in an age when most people know the Internet via the face of Google and Facebook, the IRC can seem a bit antiquated.  Yet, for open live text chatting, there’s still really nothing that has truly replaced it.  Yes, you can IM and do other forms of text chat.  But, having a room open 24/7 that anyone can go to and just text chat?  As far as I know, someone has to open a Google hangout and invite people.  There’s no list of thousands of Google hangouts you can join, particularly without knowing anyone in the channels.  What if you want to join a real-time live discussion of a topic you’re interested in?

Client programs for connecting to IRC improved a lot, especially in the 90s, giving people a graphical, easy-to-use interface.  The good news is these programs are free and have only improved over time.  Whether you are on Windows, Linux, Mac, Android or iPhone, there are great, easy-to-use client programs for connecting to IRC.  Don’t want to download and install a program?  You can now just use your web browser to connect to IRC.

 

 

 


July 2016 Jobs Report Reveals US Economic Weakness

The dollar tanked and the S&P 500 made a new all-time high when the headline news of the jobs report came out.  The probability of an interest rate hike in December increased from 32.1% to 45.4% (calculated using 30-day Fed Fund futures prices).

Our trusted media reported headline news such as

U.S. Posts Another Strong Month of Job Gains

However, the news media, which many investors and traders trust to make their decisions, has failed to look into the data in the report to understand what it really says.

The Devil in The Jobs Data

ZeroHedge pointed out that Obamacare offset weak industrial and consumer sectors.  In another article, they point out that private payrolls grew an unadjusted +85k in July, far less than the seasonally adjusted headline number of +217k.

Reviewing the labor report myself, I discovered that the only education category for those 25 and older with an increase in actual jobs from June to July was high school grads with no college.  The other 3 categories, including those with some college with or without a degree, had a decrease in the actual number employed (Table A-4).

The number of unemployed from permanent job loss (layoffs) increased from June to July from 1.848 to 2.014 million (+166k). Even the “seasonally adjusted” number, a fictitious number which is of course rosier, showed a 104k increase in permanent job loss.

Of course, with increasing layoffs come longer unemployment times, steadily increasing since May.  Average number of weeks unemployed went up in July to 28.1 from 27.1 in June.

May: 26.7
June: 27.1
July: 28.1

Weakening US Economy

All this data points to a weakening US economy.  Educated workers are losing their jobs, being increasingly laid off.  Those on unemployment are having a harder time finding a job.  The increases that the headline refers to are high school grads taking jobs that do not add much to our economic strength, as they do not replace the high paying jobs being lost.  Many, of course, are temporary jobs due to the election season, which helps to explain the increase in high school grad jobs.

Clearly, as long as we trust a news media to do our analysis for us, and do not hold them accountable to critically review the jobs report, we’ll continue to be deluded by rosy headlines despite the truth being much less bright for the US Economy.

 

 

 


Rosy labor report?

The jobs report this morning caused the S&P 500 to hit new highs of the year, a few points short of its all-time high set in May 2015.  It is tempting to pretend that all is well, just buy stocks, and hope for the best.  Yet, perhaps the best way to protect your nest egg is to take a closer look with a critical eye.

I heard a few unconfirmed things today from traders regarding that report:

  • June was revised down from 38k jobs to 11k jobs
  • A large portion of the new jobs were people 55+

Note that gold and bonds soared today (my two favorite investments of the year).  #1 on the selling-into-strength list for most of today was SPY (S&P 500 ETF), with IWD (Russell 2000) at #4.  This is post-Brexit profit taking, which is common when traders believe the market is reaching another top.

ZeroHedge has an interesting critique of the jobs report that sent the markets soaring today:

The Bearish David Rosenberg Reemerges: “What If I Told You Employment Actually Declined 119,000 In June”

Selling into strength:



Building a long-term position in gold via NUGT

I and others I know have found ourselves chasing gold.  We periodically get a nice position, take our profits, and then find ourselves missing out on the next big move because we cannot find good entry.

This has been driving me nuts all year.  While I finally just bought a gold fund in my 401k in April so I never miss out on an up move again, I’m still far from fully benefiting from the continued rise in gold.

Typically, I prefer gold futures (/GC) as a vehicle.  However, they suffer from two major limitations.  They do not have weekly options, and their options only go out a couple of months.

Thesis: Target for gold is 1600 before the end of 2016.

This is a thesis I’ve held for 2016 since mid-2015.  The first half of the year sure has confirmed the thesis.  I won’t get into all the reasons gold is soaring this year in this post.  But, the positions I’m describing here are based on capturing profits if this thesis continues to prove true.

There is good news.  While gold has risen from 1060.50 since the beginning of the year, gaining over $300, or about 30%, it still has $230+ more to go to hit $1600.  That’s still a nice gain to capture.

So, how do we capture it without constantly chasing price and hoping gold doesn’t soar while we’re sleeping during the Asian and European sessions, or looking for a pullback that never comes while it rips in front of us?

Fortunately, in addition to being a 3X leveraged ETF of gold miners with a high correlation to the price of gold, NUGT also has weekly options, and has options all the way to January covering our time frame.

There are some principles to options you’ll want to understand for this strategy:

  1. The further out in time they expire, the lower the delta and theta.  The former is good when the trade is going against you.  The latter is the cost of having lower short-term price volatility.  We’ll leverage this lower delta to scale our position when we think /GC could ultimately retreat quite a bit, yet know we can’t guarantee how far it will pull back either.
  2. Because we’ll primarily begin by building a position that decays extrinsic value, it is important to understand that extrinsic value is highest at the money (ATM).  The further out of the money (OTM) or in the money (ITM) you go, the less potential profit from the position.
  3. Because NUGT is a 3X ETF, it consumes a lot of buying power even for margin accounts.  We’ll address this by ultimately looking to create verticals.

Long via short puts

At 1357, and having only been in the 1300s for a short while, we view /GC as being about halfway through a 1300-1400 range.  Many are betting it will hit 1400 soon, and plan to short it there.  So, it has a decent probability of racing back to support near 1308.  Yet, due to the reasons it is soaring (bonds having negative returns, currencies unstable, and Brexit), there is never a guarantee it will come back down that far.  We want to be sure we have a position in case it soars without an ideal pullback.  Yet, we don’t want to be too exposed in case it does drop back near 1300; and, we want to be able to add to our position if it does.

Thus, at this level, we’ll begin our position with low-delta short puts by going out to December expiry.  For the strike, I choose to be near the money to maximize extrinsic value.  The good thing about December is the premium is high enough to easily get a break-even near $100.  Selling a 160 Dec put for $55 means your break-even at expiry is $105.

The delta on the Dec 160 is currently -.29.  That means the option price is expected to decrease by .29 per share for every $1 gain in NUGT, presuming volatility doesn’t change, and not taking into account theta burn.  That is what we like about being so far above what we currently consider strong support.  We’ll take a lot less heat than with a soon-expiring put at the same strike if /GC drops $60.

Our goal is to turn this into a vertical, as we believe /GC has a high probability of shooting for $1400 before coming down.  If you are not comfortable opening a naked option position, or don’t like the buying power reduction (BPR), you can just begin with a vertical.  However, I’m choosing to open the short side first, then the long side if /GC goes higher in the near-term.

After selling this put, I created an order to buy the 140 Dec for $20 less than I received for the 160.  If I’m super lucky, and it fills, then I just locked in max profit on the spread!  Realistically, though, I’ll look for resistance on /GC, notably 1400, and do a cancel/replace for whatever I can get then, because I anticipate a pullback there on first touch.  Regardless, it is likely to be a lot better than what I’d pay today, both because it will be worth less due to the delta, and because time will pass, burning theta.

Note that you are never locked in.  Let’s say gold hits 1400, we buy the put, creating our spread, and then gold drops $90.  We could, at that point, close the long put for a profit, or roll it to a different strike to widen our profit potential.  The idea of putting it on is to lock in a higher probability of profit while creating some downside protection.  Once we’ve used that downside protection, we can choose to remove or reduce it.

If /GC drops before I get a chance to do it, then I’ll just be stuck with a naked put for a while, and wait until /GC runs again.  Like I said, if this isn’t for you, you can just open a short vertical and be done with it.  I’m just trying to maximize potential profit and probability of profit by putting some swing trading into how the position is created.

If the naked put is a little uncomfortable, but you want to try to time the sides of the vertical, you could start with the long side first.  The downside is that the long put will be decaying while you wait for entry on the short side.  The good news is that the decay will be relatively slow since you went out to December.  That Dec 160 currently has a theta of -.16.  Contrast that to the 160 expiring in two days with a theta of -$1.30.  If you do the long side first, then you’ll be hoping for a nice pullback to complete the short side instead of waiting for /GC to go higher.  If you feel strongly about which direction it is likely to move in the next few days, this can also factor into which side you do first.

Scaling Based on /GC price

What do we do as /GC comes down towards our major support but isn’t there yet?  Remember, there’s no guarantee it will get there.  So, we want to balance the possibility that it could just drop to 1340, bounce, and never go below it again for the rest of the year, against the increasing risk and potential profit as it approaches 1300.  To do this, I’ll use closer expiries as it drops.  Perhaps Nov in the 1340-55 range, Oct in the 1320-40 range, etc.  The closer it gets to the bottom of its potential range, the more delta we’ll be willing to risk to collect more from theta burn.  🙂  This is optional.  You could stick to Dec.  I just like to increase reward/risk as it approaches support.

Note that once it gets down to 1300-15, even short OTM put spreads can be very lucrative.  Analyze these and decide whether they are a good option for you.  I just don’t think they are lucrative enough to be worth the heat until /GC is down there.  But, they are on my list of potential positions to open at that level.

Long Calls to Maximize Profit Potential

The next part of this strategy is what we’ll do if /GC pulls back near 1300, where we believe there is strong support and it will likely bounce like the last time it came near 1308.  For one, you can immediately sell short puts that expire in the near-term for quick profits on that bounce, or even if it ranges there for a bit, as theta will decay fast.  Then, take the cash you raise from that and buy some Dec calls.  You’ll have to pick the strike you like and are willing to pay for.  But here’s where you turn a potentially profitable position into a really potentially profitable position.  The potential gains of Dec NUGT calls if /GC goes from 1300 to 1600 by then are huge.  The good thing is you’ve effectively financed these with short puts.

To be clear, there is a lot of downside risk to this position.  I have strong conviction, so am not too concerned about that.  Yet, once your put side consists primarily of verticals, your risk will be limited.  If you timed it well, then you have really reduced your risk.  If you get lucky and the difference between what you collected on the short side and what you paid for the long side equals the width of the spread, you have NO RISK on that spread, as you already collected max profit and will just wait for payday!

Variations

To be sure, you can use different underlyings and combine them in different ways.  The important thing is that you are capturing both delta and theta burn in anticipation of an up move in gold, with little to no risk if gold doesn’t climb, and you are managing and limiting risk to the downside.  You’re also timing it to obtain the best position given the uncertainties.

Alternatives include using /GC options, GLD options, or anything else that moves with the price of gold.

 

 


US Commercial Real-Estate Storm Brewing?

One counter I hear to the possibility of a recession coming to the US before the end of the year is that real estate is booming.  Is it?  It has been for 6 years, but this year doesn’t look as rosy.

This Bloomberg article suggests there are several reasons this market could drop within the next 12 months:

Pimco Says ‘Storm Is Brewing’ in U.S. Commercial Real Estate

Signs of a cooling real estate market have emerged across the country since the start of the year. Commercial-property values in big U.S. cities, which have seen the largest increases during the recent boom, have declined 3 percent in the past three months, Moody’s Investors Service and Real Capital Analytics Inc. said in a June 6 report. Real estate transactions in New York, the biggest U.S. property market, are forecast to decline by as much as 30 percent this year, brokerage Cushman & Wakefield said in April.

 


Major Economies Are Not Raising Rates in 2016

If the US raises central bank (CB) rates this year, it looks like it would be the only major economy to do so.  So far in 2016, major economies are either lowering rates or leaving them steady.  The ECB, for instance, hasn’t changed rates yet in 2016.  But that’s because they are at all-time lows of 0%.  Other European countries, such as Sweden, Denmark and Switzerland, have been experimenting with a so-called Negative Interest Rate Policy (NIRP).

So, who is raising rates?  Looking at the list of rate changes in the first 5 months of 2016, the two largest economies to have raised rates are Argentina and Denmark.

Argentina.  According to the IMF, at less than 1/30th the size of US GDP, Argentina ranked 21st in the world for GDP in 2015.  But not only is the largest economy to raise rates this year relatively small, it is raising rates to combat hyperinflation after years of printing its currency, the result of years of political corruption, with populist leaders printing to meet campaign promises. So, it should not surprise anyone that they raised rates from 35.43% to 36.9%.  On the plus side, this is lower than their all-time high of 1390%.  I also heard from a resident of Argentina that they recently elected a leader who is restoring a free market to Argentina and trying to rein in the craziness that led to the constant depreciation of their peso.  That friend cautioned, however, that it will take years for the reforms to take hold and restore this economy.  I was a bit more optimistic about the pace until I saw that they had to raise rates again.

Denmark.  At half the size of Argentina’s economy, Denmark is ranked 36th by the IMF for GDP.  If you believe that central bank depository interest charges are “negative interest rates”, then they kicked off the year by raising their CB rate from -0.75% to -0.65%.  Many don’t expect another increase until next year, and don’t expect to see positive rates until at least 2018.

The two largest economies to raise rates so far this year are, in fact, both relatively small compared to the US, EU, China and Japan.  And, clearly, they are both in extreme situations, one fighting hyperinflation, the other fighting deflation.  Neither move is an ordinary increase designed to cool off an overheating economy and smooth out the business cycle, the primary justification the US has claimed for raising rates in the past.

If the US raises rates this year, it looks like it could be very much alone in the world.  In my lifetime, I have never seen such a context.  Not sure it mattered prior to my lifetime as the economy wasn’t globally connected back then.  This is history in the making.  Can the US raise rates in a global context where the world is combating slowing growth, deflation, and in a few countries, currency printing induced hyperinflation?

 
