Splunk datetime.xml needs your attention! Deadline? 1st January 2020
26 November 2019

For all of you Splunk users out there, listen up!

There is an important issue you need to address with your Splunk deployments today. It's recently been disclosed that a date-time configuration issue in current versions of Splunk Enterprise will cause problems with some data onboarding in your environments come 1st January 2020.

The issue? If any of your data is arriving into your Splunk environment with a 2-digit year like 01/01/20, then Splunk may not correctly identify this as the year 2020, and the time will either be extracted incorrectly, or default back to the 'indexed time', i.e. the time that Splunk ingested the data. That's not good! This could lead to missed alerting, inconsistent data, or even issues with retention policies because of the incorrect timestamps. Once that data has arrived in Splunk there's no way to correct it, so please read on and take the appropriate steps for your organisation.
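To picture what that looks like, take a made-up event like this one (the host and fields here are purely illustrative):

01/01/20 00:05:12 host=web01 action=user_login status=success

With the unpatched datetime.xml, Splunk may not read that '20' as 2020 when extracting the timestamp, so the event can land with the wrong _time, or simply take the time it happened to be indexed.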

What's in scope?

Any ‘full’ instance of Splunk Enterprise; namely Search Heads, Indexers, Heavy Forwarders, Deployment Servers, Monitoring Consoles, etc.

Any Universal Forwarder where INDEXED_EXTRACTIONS is being used instead of the 'magic 6' date-time parsing configurations in props.conf (a quick check is sketched below).
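Not sure whether a forwarder (or any instance) is using INDEXED_EXTRACTIONS? A quick check along these lines, assuming a Linux host and the default $SPLUNK_HOME, will list any sourcetypes that set it:

$SPLUNK_HOME/bin/splunk btool props list --debug | grep INDEXED_EXTRACTIONS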

The good news is that it’s easily fixed through a number of options which we detail below.  Further information can be found at Splunk’s documentation site here: https://docs.splunk.com/Documentation/Splunk/8.0.0/ReleaseNotes/FixDatetimexml2020

NB. On Splunk Cloud? Don't worry! The Cloud Operations team is already handling the rollout of this fix to your cloud environment, and you will be contacted over the next few weeks to confirm and approve this upgrade activity. Just make sure you take care of your on-premise components as below!

On Premise instances and deployments?

The best course of action is to upgrade your Splunk deployment within your current release line (e.g. to versions 7.1.10, 7.2.9.1, 7.3.3, or 8.0.1) - a good option.

Yes, newer versions of Splunk Enterprise fix the configuration file causing this 2-digit date issue, namely the datetime.xml file bundled with your Splunk installation.
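Not sure which release you're on right now? A quick way to check (assuming the default install path) is:

$SPLUNK_HOME/bin/splunk version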

If a full upgrade isn't an option (for example, if you are currently on 8.0.0 and awaiting the 8.0.1 release), then keep reading.

Download the updated datetime.xml file, which contains the fixed configuration. You can either deploy it to your Splunk instances (overwriting $SPLUNK_HOME/etc/datetime.xml directly, as sketched below) or build an app package to handle it (details further down).
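As a rough sketch of the direct-replacement route on a Linux host, assuming the default $SPLUNK_HOME and that you've saved the fixed file to /tmp/datetime.xml (the /tmp path is just our example):

cp $SPLUNK_HOME/etc/datetime.xml $SPLUNK_HOME/etc/datetime.xml.orig
cp /tmp/datetime.xml $SPLUNK_HOME/etc/datetime.xml
$SPLUNK_HOME/bin/splunk restart

Keeping a copy of the original file makes it easy to roll back if needed.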

Modifying the file directly introduces the least change, but heads up: it also means the file integrity checks inside Splunk will fail, because a core configuration file has been altered. Don't be alarmed; the system will still work with this change, but the check will keep failing until you complete a 'full' patch release or future upgrade.
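If you want to see that integrity warning for yourself, or re-check it after a later upgrade puts the stock file back, the built-in check can be run manually, for example:

$SPLUNK_HOME/bin/splunk validate files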

You could also modify datetime.xml manually on each Splunk server (not recommended).

While you could modify the file manually, this introduces more risk and potential for 'human error' in your environment, and could cause additional issues.

But what do we at 13Fields recommend?

There’s an app for that!

The great minds at Splunk Professional Services came up with this one, and it's definitely our preferred method: by using the built-in Splunk approach of apps and add-ons, we can deploy the config in an app! How exactly? Drop an app into your Splunk environment called something like 'easybake_datetime_fix', using the datetime.xml file linked above, and set up a props config as follows:

$SPLUNK_HOME/etc/apps/easybake_datetime_fix/local/props.conf
[default]
# this path is resolved relative to $SPLUNK_HOME
DATETIME_CONFIG = /etc/apps/easybake_datetime_fix/local/datetime.xml

Put the updated datetime.xml file into the 'local' directory of the app, alongside the props.conf file, and you're done.
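So the finished app looks something like this (using our example app name from above):

$SPLUNK_HOME/etc/apps/easybake_datetime_fix/
    local/
        props.conf
        datetime.xml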

Then you can use your current app or add-on distribution technique (e.g. Cluster Master, Deployer, Deployment Server) or other config orchestration tools (e.g. Ansible, Puppet, etc.) to push this new app out to all Splunk Enterprise and UF instances; one serverclass.conf sketch is shown below.
Eventually, when you naturally reach an upgrade point, you can remove this override application from your deployment and move back to the default configuration.
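As one illustration, if you use a deployment server, a serverclass.conf entry along these lines would push the app out to every client (the server class name and whitelist are hypothetical, so adapt them to your own environment):

[serverClass:datetime_fix_all_hosts]
whitelist.0 = *

[serverClass:datetime_fix_all_hosts:app:easybake_datetime_fix]
restartSplunkd = true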

Finally, how to test

If you want to go through some proper testing of the above solution(s), you should 'oneshot' or upload a file-based input with the timestamps modified to a 2-digit 01-01-20 or 01/01/20 style format, and ensure Splunk detects the timestamp properly as the year 2020 - not 1920 or even just 20!
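For example, something along these lines would one-shot a small test file in (assuming a throwaway file /tmp/y2020_test.log containing events with 01/01/20 style timestamps, and a test sourcetype of your choosing):

$SPLUNK_HOME/bin/splunk add oneshot /tmp/y2020_test.log -sourcetype y2020_test -index main

Then search that sourcetype and check the extracted _time values land in 2020.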

If you need any assistance or guidance on how to roll this change out in your environment, don't hesitate to contact us at getintouch@13fields.co.uk; we'd love to help.
