Papercut: Calc script validation with FDM tokenization

Here’s a papercut I’d like to present in the context of my thoughts on papercuts in the Essbase ecosystem. I’ve recently been doing a bit more work with FDM. After an FDM data load, you need to calc the data related to the intersections you just loaded (although I suppose you could just calc the whole thing if you wanted to). In other words, if you are loading data for a certain time period or location or whatever, you’ll want to roll that data up. Nothing special there. So you have a normal calc script, except it has been parameterized for FDM: FDM searches the script for tokens, such as in FIX statements, and replaces each template variable with the real member name, so [T.Location] gets replaced with the actual location. But guess what: when you go to validate the calc script now (and you do always validate your calc scripts, right?), it doesn’t validate, because the tokenized names aren’t real members in the outline.
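To make this concrete, here is roughly what a tokenized FIX might look like versus what actually runs after substitution (the token and member names here are illustrative, not taken from a real FDM configuration):

```
/* Tokenized script as stored. "[T.Location]" is not a real member
   in the outline, so this version fails validation. */
FIX ("[T.Location]", "Actual")
    CALC DIM ("Accounts");
ENDFIX

/* What effectively runs after FDM swaps in the real location. */
FIX ("Store123", "Actual")
    CALC DIM ("Accounts");
ENDFIX
```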

Hmm.

So, I’m not an FDM expert. Maybe there’s an option to work around this that I don’t know about. Maybe you can stuff these tokenized names into a dummy alias table so that you can at least validate. But it seems like the “right” way to handle this would be a solution where you can still validate the calc. One straightforward option might be for the validator to ignore quoted names enclosed in brackets, like “[T.Location]”. But it feels wrong that using FDM and tokenizing your calc script leads you down this path. If you have worked with this and know a solution I’m missing (because you are an FDM expert and I’m not), please let me know. But right now it’s just one of those quirky, less-than-ideal little issues that I consider an Essbase Papercut.
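In the same spirit as the dummy alias table idea, one thing you could script yourself is a quick pre-validation pass that stubs each token with a real member, validates that copy, and keeps the tokenized script as the source of truth. Here is a minimal sketch in Python; the token format, stub members, and the whole approach are my assumptions for illustration, not documented FDM behavior:

```python
import re

# Hypothetical stand-in members that actually exist in the outline.
VALIDATION_STUBS = {
    "[T.Location]": "SomeLocation",
    "[T.Period]": "Jan",
}

# Match quoted FDM-style tokens such as "[T.Location]".
TOKEN_PATTERN = re.compile(r'"\[T\.\w+\]"')

def stub_tokens(script_text):
    """Replace quoted [T.*] tokens with stub members so the copy validates."""
    def substitute(match):
        token = match.group(0).strip('"')
        if token not in VALIDATION_STUBS:
            raise KeyError("No validation stub defined for token: " + token)
        return '"' + VALIDATION_STUBS[token] + '"'
    return TOKEN_PATTERN.sub(substitute, script_text)

if __name__ == "__main__":
    tokenized = (
        'FIX ("[T.Location]", "[T.Period]")\n'
        '    CALC DIM ("Accounts");\n'
        'ENDFIX'
    )
    # Prints a validatable copy with real members in place of tokens.
    print(stub_tokens(tokenized))
```

You would still hand the original, tokenized version to FDM; the stubbed copy only exists so that the validate step has real member names to resolve.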

In case you missed it: Oracle Data Integrator success story webinar

A little while ago I presented a success story using Oracle Data Integrator. The webinar goes over ODI functionality at a high level, giving a quick tour of the benefits of its major functional areas and how they were used to greatly improve and build out the ETL and automation at a high-profile client. If you are interested in using ODI in your organization or aren’t sure (but want to know more!), then I strongly encourage you to check it out. ODI is now one of my specialization areas, and I have been doing a tremendous amount of work with it, so I look forward to becoming an expert in this technology and sharing that expertise with others.

Note that this webinar, while technical, is not a deep dive into implementation details. I am planning a follow-up webinar that will look very comprehensively at using ODI as an ETL/automation solution and implementing it with best practices. That will be the webinar to look for if you are more technical and really want to see the ins and outs of how things work and how they can be architected for robustness, maintainability, and efficiency.