So today was day one of Splunk .conf2017. This being my first time at .conf, I wasn’t entirely sure what to expect. The morning started off with the keynote address by Splunk’s CEO, Doug Merritt. A couple of interesting numbers to start with: 7,187 people from 65 countries were registered to attend .conf this year, and they traveled a combined 65 million miles to get to Washington DC (enough miles to go to and from the moon over 100 times). While a good chunk of the presentation was the expected infomercial for Splunk, there were a few cool things. The first was a short chat with Michael Ibbitson, the Executive Vice President, Technology & Infrastructure at Dubai Airports. What made his part really cool is the sheer scope of what they are doing with Splunk. They have gone much further than just using it to monitor their IT systems; they are using it to monitor EVERYTHING: IT systems, suitcases, even whether or not people are washing their hands after using the bathroom (only about 75% are). Dubai Airports is grabbing as much data as humanly possible and using it to change the way they operate.
One of the big announcements today was the release of Splunk 7.0 as of this morning. I have had exactly zero time to look at it (I plan to in the very near future), but the improvements look really cool. While many of the changes are cosmetic, one of the big improvements is the way it handles searches. On its own, 7.0 is supposed to greatly improve how efficiently it uses the hardware it’s running on (meaning faster searches). The HUGE improvement, though, is the release of metrics and mstats. If you are familiar with tstats, you know how fast it is. Metrics are a new way of handling structured, numeric measurement data (think processor utilization, disk space, etc.). Because they are structured, Splunk is able to handle them much more efficiently. One of the demos they showed was a query over about 270 million records. Using the regular stats command, several test queries took in the neighborhood of 150 seconds to run; using the mstats command, the same query ran in about 5 seconds. That is a huge reduction in time, especially when you expand that record count into the billions and beyond. Metrics have some very specific requirements that I still need to look into, but you can start to read about them here.
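To make the stats-vs-mstats comparison a little more concrete, here is a rough sketch of what the two approaches look like in SPL. The index names, sourcetype, field names, and metric name below are all hypothetical, and the mstats syntax is based on my reading of the Splunk 7.0 docs; treat this as an illustration, not a tested query. The first search computes an average over raw events with stats; the second computes the same average from a metrics index with mstats:

```spl
index=server_events sourcetype=cpu_usage
| stats avg(pct_cpu) AS avg_cpu BY host

| mstats avg(_value) AS avg_cpu WHERE index=server_metrics metric_name="cpu.pct_cpu" BY host
```

Both should return an average per host, but the mstats version reads pre-structured measurement points straight from a metrics index instead of parsing raw events at search time, which is where the demoed ~150-second-to-~5-second speedup comes from.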
Another announcement this morning was about their non-profit arm, Splunk4Good. They are working with other non-profit groups to give away Splunk licenses and training at greatly reduced or free prices. They showcased a couple of pretty cool projects that they are working on, and I personally went by their booth and signed up to volunteer my time/expertise to possibly help out. The part that matters for all of us is that they also announced the creation of veterans.splunk.com. Essentially, they are offering all vets and active duty service members FREE Splunk training. For someone who is out, or getting ready to leave the service, this is a really big opportunity that you should look into.
I had originally planned to write more about some of the individual sessions, but it’s 10 pm, I’ve had a long-ass day, and there’s more to come tomorrow, so the sessions will have to wait for another day.