You work in an operations group for a streaming media service company. Your team wants to process web logs from your server farms to obtain operations analytics and to identify maintenance issues.
Your back-end system collects data regarding server access and system load in your server farms. Your team wants to identify the operations that have created the most server load in the past few weeks. You want to store data afterwards for auditing purposes.
Before your data analysts can begin working with the data, you must parse it. However, the logs are semi-structured, and after server upgrades the log file structure might change slightly or some of the information might take a different format. With a standard transformation, such changes would cause data loss or log processing failures.
If the input data contains headers, Intelligent Structure Discovery supports data drift to different locations. If the input data does not contain headers, Intelligent Structure Discovery identifies additional data at the end of the input.
Your initial log files have the following structure:
05967|2014-09-19|04:49:50.476|51.88.6.206|custid=83834785|cntry=Tanzania|city=Mtwango|movie={b1027374-6eec-4568-8af6-6c037d828c66|"Touch of Evil"}|paid=true
01357|2014-11-13|18:07:57.441|88.2.218.236|custid=41834772|movie={01924cd3-87f4-4492-b26c-268342e87eaf|"The Good, the Bad and the Ugly"}|paid=true
00873|2014-06-14|09:16:14.522|134.254.152.84|custid=58770178|movie={cd381236-53bd-4119-b2ce-315dae932782|"Donnie Darko"}|paid=true
02112|2015-01-29|20:40:37.210|105.107.203.34|custid=49774177|cntry=Colombia|city=Palmito|movie={ba1c48ed-d9ac-4bcb-be5d-cf3afbb61f04|"Lagaan: Once Upon a Time in India"}|paid=false
00408|2014-06-24|03:44:33.612|172.149.175.30|custid=29613035|cntry=Iran|city=Bastak|movie={3d022c51-f87f-487a-bc7f-1b9e5d138791|"The Shining"}|paid=false
03568|2015-01-07|11:36:50.52|82.81.202.22|custid=27515249|cntry=Philippines|city=Magallanes|movie={ad3ae2b4-496e-4f79-a6dd-202ec932e0ae|"Inglourious Basterds"}|paid=true
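As a rough illustration of the kind of parsing an intelligent structure model automates, the following Python sketch splits one log record into named fields. The positional field names (transactionId, date, time, ip) are assumptions based on the record layout above; this is not Informatica code.

```python
import re

def parse_record(line):
    # Split on pipes that are NOT inside the {...} movie group, which
    # embeds a pipe between the movie GUID and the title.
    parts = re.split(r'\|(?![^{]*\})', line.strip())
    # The first four fields are positional; the rest are key=value pairs.
    record = dict(zip(("transactionId", "date", "time", "ip"), parts[:4]))
    for field in parts[4:]:
        key, _, value = field.partition("=")
        record[key] = value
    return record

rec = parse_record(
    '05967|2014-09-19|04:49:50.476|51.88.6.206|custid=83834785|'
    'cntry=Tanzania|city=Mtwango|'
    'movie={b1027374-6eec-4568-8af6-6c037d828c66|"Touch of Evil"}|paid=true'
)
```

Because the trailing fields are treated as key=value pairs rather than fixed positions, records that omit cntry and city still parse correctly.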
After server upgrades, some log files have a different structure: the data format varies, and some of the data has drifted to a different location.
The following image shows the data variations:
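To see why a standard, rigid transformation is brittle against this drift, consider a naive positional parser. The field list below is an assumption based on the log structure shown earlier, and the movie values are abbreviated; this is only a sketch, not Informatica code.

```python
# Fixed-position parsing assumes every field is always present, in order.
FIELDS = ("transactionId", "date", "time", "ip",
          "custid", "cntry", "city", "movie", "paid")

def rigid_parse(line):
    # Naive split by pipe, mapping values to fields purely by position.
    return dict(zip(FIELDS, line.split("|")))

full = rigid_parse('05967|2014-09-19|04:49:50.476|51.88.6.206|'
                   'custid=83834785|cntry=Tanzania|city=Mtwango|'
                   'movie={...}|paid=true')
# full["cntry"] holds "cntry=Tanzania", as expected.

drifted = rigid_parse('01357|2014-11-13|18:07:57.441|88.2.218.236|'
                      'custid=41834772|movie={...}|paid=true')
# With cntry and city absent, the movie value lands in the cntry field:
# the record is silently misaligned, which is the data loss described above.
```

An intelligent structure model avoids this failure mode because it identifies fields by their structure rather than by fixed position.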
Instead of manually creating individual transformations, your team can create an intelligent structure model that determines the relevant data sets. You create the model in Intelligent Structure Discovery, which automatically identifies the structure of the data.
The following image shows the intelligent structure that you create:
When you examine the data, you realize that the first element in the model, number, actually represents the user transaction ID. You change the element name to transactionId.
The following image shows the updated intelligent structure:
After you save the intelligent structure as an intelligent structure model, you create a Structure Parser transformation and assign the model to it. You can add the transformation to a Data Integration mapping with a source, a target, and other transformations. After the mapping fetches data from a source connection, such as Amazon S3 input buckets, the Structure Parser transformation processes the data with the intelligent structure model. The transformation passes the web log data to downstream transformations for further processing, and then to a target, such as Amazon S3 output buckets.