FCPA Compliance and Ethics Blog

April 15, 2015

Five Step Process for Transaction and Continuous Controls Monitoring

Most Chief Compliance Officers (CCOs) and compliance practitioners understand the need for transaction monitoring. Whether it is part of your overall monitoring of third parties or employees, or a way to test the overall effectiveness of internal controls, transaction monitoring is clearly part of a best practices compliance program. Further, while most compliance practitioners are aware of the tools that can be applied to transaction monitoring, they may not be as aware of how to actually engage in the process. Put another way, how do you develop a methodology for building a transaction monitoring process that yields sustainable, repeatable results?

I recently put that question to one of the leaders in the field, Joe Oringel, co-founder and principal at Visual Risk IQ. He explained to me that his firm has distilled data analytics and transaction monitoring into a five-step process they call QuickStart, which is applied iteratively across a two- to four-month time frame. These iterations allow for, and reinforce, the methodology's repeated and practical application. The five steps are (1) Brainstorm, (2) Acquire and Map Data, (3) Write Queries, (4) Analyze and Report, and (5) Refine and Sustain.

Brainstorm

Under this step, the transaction monitoring specialist, a subject matter expert (SME) on the Foreign Corrupt Practices Act (FCPA) or other anti-corruption law, and the compliance team members sit down and work through a multi-item list to better understand the objectives and set the process going forward. The brainstorming session will include planning the monitoring objectives and understanding the data sources available to the team. Understanding the relationships between the monitoring objectives and the data sources is essential to the monitoring process. During brainstorming, the company's risk profile and its existing internal controls should be reviewed and discussed. Finally, there should be a selection of the transaction monitoring queries and a prioritization among them. This initial meeting should include company representatives from a variety of disciplines, including the compliance, audit, IT, legal and finance departments; sales and business development may also need to be represented in this initial brainstorming session.

While the rest of the steps may seem self-evident in any transaction monitoring process, it is the brainstorming step which sets the Visual Risk IQ approach apart. This is because business knowledge is critical to sustaining and improving the transaction monitoring process. And because the process is iterative, periodic meetings to further understand the business pulse allow the most useful data to be monitored through the system. 

Acquire and Map Data

The second step is to obtain the data. There may be a need to discuss security considerations, to decide whether or how to redact or mask sensitive data, and to ensure files are viewable only by team members with a “need to know”. Next comes balancing, which consists of comparing the number of records, checksums, and control totals in the source file (as computed by the file export) with the re-calculated number of records, checksums, and control totals (as computed by a file import utility). Balancing is performed to make sure that no records are dropped or somehow altered, and that the files have integrity. Somewhat related is making sure that the version of the files used is the “right” one. For example, if you are required to obtain year-end data, the year-end close could be completed weeks after year-end, depending on the departments engaged in the year-end processes, so confirm that all closing entries have actually been recorded.
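As a concrete illustration, the balancing step might look like the following Python sketch. The file name, column name, and expected totals in the usage comment are hypothetical; the idea is simply to re-compute the record count and control total after import and compare them with the figures reported by the export.

```python
import csv
from decimal import Decimal

def balance_totals(path, amount_field="amount"):
    """Re-compute record count and control total for a delimited extract file."""
    count, total = 0, Decimal("0")
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            count += 1
            total += Decimal(row[amount_field])  # exact decimal arithmetic
    return count, total

# Compare against the counts and totals reported by the source system's
# export log before relying on the file (values below are hypothetical):
# expected = (125000, Decimal("98437214.55"))
# assert balance_totals("ap_payments.csv") == expected
```

A mismatch in either figure means records were dropped or altered in transit, and the extract should be re-run before any queries are written against it.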

Types of systems of record could include Enterprise Resource Planning (ERP) data from multiple transaction processing systems, including statistics on the numbers and locations of vendors, brokers and agents. You may also want to consider watch lists from organizations such as the Office of Foreign Assets Control (OFAC), the Transparency International Corruption Perceptions Index (TI-CPI), lists of Politically Exposed Persons (PEPs) or other public data sources. Some of the data sources include information from your vendor master file, general ledger journals, payment data from accounts payable, P-cards or your travel and entertainment system(s). You should also consider sales data and contract awards, as correlations between spending and sales may be significant. Finally, do not forget external data sources such as your third party transactional data. All data should initially be secured and then transmitted to the transaction monitoring tool. Of course you need to take care that your transaction monitoring tool understands and properly maps this data in the form in which it is submitted.

Write Queries

This is where the FCPA SME brings expertise and competence to assist in designing the specific queries to include in the transaction monitoring process. It could be that you wish to focus on the billing of your third parties; your employees' spend on gifts, travel and entertainment; or even petty cash outlays. From the initial results that you receive back, you can then refine your queries and filter criteria going forward. Some of the queries could include the following:

  • Business courtesies to foreign officials;
  • Payments to brokers or consultants;
  • Payments to service intermediaries;
  • Payments to vendors in high risk markets;
  • Round dollar disbursements;
  • Political contributions or charitable donations; and
  • Facilitation payments.
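Two of the queries above can be sketched in code. The record layout and the high-risk country list below are illustrative assumptions for this sketch, not any particular ERP's schema or an actual risk ranking:

```python
# Hypothetical high-risk market list; in practice this would be derived
# from the TI-CPI or your own risk assessment.
HIGH_RISK = {"Country A", "Country B"}

def round_dollar_disbursements(payments, threshold=1000):
    """Flag payments in suspiciously round amounts at or above a threshold."""
    return [p for p in payments
            if p["amount"] >= threshold and p["amount"] % 1000 == 0]

def high_risk_vendor_payments(payments, vendors):
    """Flag payments to vendors located in high-risk markets."""
    return [p for p in payments
            if vendors.get(p["vendor_id"], {}).get("country") in HIGH_RISK]
```

Each query returns a candidate list for review, not a finding; a round-dollar payment or a high-risk-market vendor is a reason to look closer, nothing more.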

Analyze and Report

In this process step, you are now ready to begin substantive review and any needed research of potential exceptions and reporting of results. Evaluating the number of potential exceptions and modifying queries to yield a meaningful yet manageable number of potential exceptions going forward is critical to long-term success. You should prioritize your initial results by size, age and source of the potential exception. Next you should perform a root cause analysis of what you might have uncovered. Finally, at this step you can prioritize the data for further review through a forensic review. An example might be a review of duplicate payments or vendor-to-employee conflicts. Through such an analysis you can determine whether there were incomplete vendor records, whether duplicate payments were made and, if so, whether such payments were within your contracts' terms and conditions.
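The duplicate-payment analysis mentioned above can be sketched as a simple grouping exercise. Field names here are assumptions for illustration; a production version would also normalize invoice numbers and match on near-duplicate amounts:

```python
from collections import defaultdict

def find_duplicate_payments(payments):
    """Return groups of payments sharing vendor and amount (possible duplicates)."""
    groups = defaultdict(list)
    for p in payments:
        groups[(p["vendor_id"], p["amount"])].append(p)
    # Only groups with more than one record are candidate duplicates.
    return {k: v for k, v in groups.items() if len(v) > 1}
```

Each surviving group then gets the root-cause treatment described above: was the vendor record incomplete, was the payment truly duplicated, and was it within the contract's terms?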

Refine and Sustain

This is the all-important remediation step. You should use your root cause analysis and any audit information to recalibrate your compliance regime as required. At this step you should also apply the lessons you have learned to your next steps going forward. You should refine the process through the addition or deletion of input files, thresholds for specific queries, or other query refinements. For example, if you have set your dollar limits so low that too many potential exceptions resulted for a thoughtful review, you might raise your dollar threshold for monitoring. Conversely, if your selected amount was so high that it did not generate sufficient transactions for review, you could lower your parameter limits. Finally, you can use this step to determine the frequency of your ongoing monitoring.

Oringel concluded by emphasizing the iterative nature of this process. If you can establish your extraction and mapping rules, using common data models within your organization, you can use them to generate risk and performance checks going forward. Finally, through thoughtful use of transaction monitoring parameters, you can create metrics that you can internally benchmark your compliance regime against over time to show any regulators who might come knocking.

For further information on this process, contact Joe Oringel at Joe.Oringel@VisualRiskIQ.com

This publication contains general information only and is based on the experiences and research of the author. The author is not, by means of this publication, rendering business, legal advice, or other professional advice or services. This publication is not a substitute for such legal advice or services, nor should it be used as a basis for any decision or action that may affect your business. Before making any decision or taking any action that may affect your business, you should consult a qualified legal advisor. The author, his affiliates, and related entities shall not be responsible for any loss sustained by any person or entity that relies on this publication. The Author gives his permission to link, post, distribute, or reference this article for any lawful purpose, provided attribution is made to the author. The author can be reached at tfox@tfoxlaw.com.

© Thomas R. Fox, 2015

February 4, 2015

Five Tips for Advancing with Audit Analytics-Part III

Filed under: Best Practices,Big Data,Data Analytics,Joe Oringel,Visual Risk IQ — tfoxlaw @ 12:01 am

Ed. Note: Joe Oringel, Principal at Visual Risk IQ, recently wrote a series of blog posts on advancing your business through the use of data analytics and audit. I asked Joe if I could repost his articles, which he graciously allowed me to do. So today I conclude a 3-day series of blog posts which reprint his posts. Today is the final post, Tip 5.

Tip 5 – Supplement Necessary Skills with Internal or External Resources

This week we have been posting about how to succeed with data analytics in areas such as internal audit and compliance. Monday we introduced the following Body of Knowledge and indicated that each of the skills below are often needed for a data analytics project.

  • Project Management
  • Data Acquisition and Manipulation
  • Statistical techniques
  • Visual Reporting techniques
  • Communication
  • Audit and Compliance Domain expertise
  • Change Management and Strategic Thinking

Does this mean that audit teams need a statistician or visual reporting whiz in the department? Not at all. Just as audit teams co-source with supplemental resources, they can also co-source for data analytics. Better still, co-sourcing with internal company resources, in the form of a secondment or guest auditor, is often possible. Reach into IT's Business Intelligence or data warehouse group, and internal audit can find talent with excellent company and data manipulation expertise. Reach into HR or Finance for someone with domain expertise around incentive compensation, and team up on that important sales commission audit project.

Will these resources have advanced audit or compliance domain expertise? Probably not, but Tom Brady doesn’t play running back or wide receiver yet he makes those players better by fitting the pieces together. Audit and compliance leaders know what questions we want to answer. It’s the “how” where we sometimes need help. At Visual Risk IQ, I have the very good fortune to work with an incredibly talented team that is deep in database design, data manipulation, programming, and visualization skills. We work together to make sure that our queries are answering the right business questions, and in turn that those answers are being communicated in a way that is precise and easy to understand.

When we first worked in domains where our experience had been limited (e.g. health claims in 2008, FCPA / anti-corruption in 2010, or HR in 2013), we relied heavily on domain expertise from our clients' General Counsel's offices or on consultants to our firm, so we could bring the full expertise needed for a project, given the body of knowledge framework above. This technique has worked consistently for us, and it works for audit and compliance teams too.

Why are audit analytics so important? First, used as a monitoring tool, audit analytics can lower audit costs by reducing reliance on manual sampling. Second, audit analytics can improve financial governance by increasing the reliability of transactional controls and the effectiveness of anti-corruption controls. Third, they can improve actual operational performance by monitoring key financial processes.

However, it may be more simply put in the context of McNulty's Three Maxims, the three general areas of inquiry the Department of Justice would assess regarding an enforcement action. First: “What did you do to stay out of trouble?” Second: “What did you do when you found out?” And third: “What remedial action did you take?”

The Visual Risk IQ studies include case studies of both accounts payable and purchase card spend, examining whether there was fraud or misuse of the cards. The key in both of these reviews, which involved continuous controls monitoring, was data review. This same type of testing can be utilized in reviewing foreign business partners, including agents, resellers, distributors and joint venture partners. All foreign business partner financial information can be recorded and analyzed. The analysis can be compared against an established norm, derived either from a business's own standard or from an accepted industry standard. If a payment, distribution or other remuneration to or from a foreign business partner is outside an established norm, thus creating a Red Flag, such information can be tagged for further investigation.
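The “outside an established norm” test described above can be sketched as a simple standard-deviation screen. The three-sigma threshold below is an assumption for illustration and would be tuned to the business's own standard or an industry norm:

```python
import statistics

def red_flag(history, new_amount, k=3.0):
    """True when new_amount lies more than k standard deviations from the
    mean of the partner's historical payment amounts."""
    mean = statistics.mean(history)
    sd = statistics.pstdev(history)  # population std dev of the history
    return sd > 0 and abs(new_amount - mean) > k * sd
```

A flagged payment is only tagged for further investigation; the norm itself (mean, dispersion, and threshold) should be revisited as more partner history accumulates.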

Many companies have yet to embrace audit analytics as a standard part of their FCPA compliance program. While it is difficult to test behavioral aspects of an FCPA compliance policy, such as whether an employee will follow a company's FCPA-based Code of Conduct, other testing can be used to form the basis of a thorough review. For instance, it can be difficult to determine if an employee will adhere to the requirements of the FCPA. However, continuous controls monitoring can be used to verify the pre-employment background check performed on an employee, the quality of the FCPA compliance training an employee receives after hire, and an employee's annual acknowledgement of FCPA compliance. For a multi-national US company with thousands of employees across the world, the retention and availability of such records is an important component of the FCPA compliance program, and it will also go a long way toward a very positive response to McNulty's inquiry of “What did you do to stay out of trouble?”

Good luck in 2015 with your data analytics projects! Please write or call if you’d like to compare ideas on how to excel in data analytics for audit or compliance. We’d be happy to assist in your success!

Joe Oringel is a CPA and CIA with 25 years of experience in internal auditing, fraud detection and forensics. He has over ten years of Big 4 external audit, internal audit, and advisory experience, most recently with PricewaterhouseCoopers. His corporate experience includes information security, internal auditing, and risk and control of large ERP systems for companies in highly regulated industries, including Pharmaceuticals, Utilities, and Financial Services. Partner Kim Jones and Joe founded Visual Risk IQ in 2006 as an advisory firm focused solely on Data Analytics, Visual Reporting, and Continuous Auditing and Monitoring. He can be reached at joe.oringel@visualriskiq.com

This publication contains general information only and is based on the experiences and research of the author. The author is not, by means of this publication, rendering business, legal advice, or other professional advice or services. This publication is not a substitute for such legal advice or services, nor should it be used as a basis for any decision or action that may affect your business. Before making any decision or taking any action that may affect your business, you should consult a qualified legal advisor. The author, his affiliates, and related entities shall not be responsible for any loss sustained by any person or entity that relies on this publication. The Author gives his permission to link, post, distribute, or reference this article for any lawful purpose, provided attribution is made to the author.

 © Joe Oringel 2015

February 3, 2015

Five Tips for Advancing with Audit Analytics-Part II

Filed under: Big Data,Data Analytics,Joe Oringel,Visual Risk IQ — tfoxlaw @ 12:01 am

Ed. Note: Joe Oringel, Principal at Visual Risk IQ, recently wrote a series of blog posts on advancing your business through the use of data analytics and audit. I asked Joe if I could repost his articles, which he graciously allowed me to do. So today I continue a 3-day series of blog posts which reprint his posts. Today covers Tips 3 and 4.

 Tip 3- Understanding Your Data

Tip 3 for advancing with audit and compliance analytics is to “Understand Your Data, and Explore it Fully Before Developing Exception Queries.” One common mistake that we see audit and compliance professionals make with data analytics is that they sometimes dive right into searching for transaction exceptions before exploring their data fully. This limits the effectiveness of their analysis, because they are searching for something specific and can overlook other conditions or anomalies in their data. If you’ve not seen the selective attention (aka Gorilla and Basketball) videos from Daniel Simons, here’s a fun link.

Selective attention on exception queries seems to happen due to the strengths of traditional analytics tools like Microsoft Excel and general purpose tools like CaseWare IDEA or ACL. It is less common with Visual Reporting tools like Tableau and Qlikview, in part because these tools are designed to specifically support data exploration and interaction with click and drill-through capabilities. Visual Reporting capabilities are very effective for data exploration, and some rudimentary visual capabilities can be found in Excel, IDEA, and ACL.

During data analytics brainstorming, we categorize analytics queries as Metric Queries, Outlier Queries, and Exception Queries. When prioritizing queries to be built for client assignments, we make sure that there are some of each type of query, so that sufficient data exploration takes place before we jump into exception queries or begin researching exceptions.

Metric queries are analytics such as “Top 10 Vendors by Vendor Spend,” “Top 10 Vendors by Number of Transactions,” or “Top 10 Dates of the Year for Requisitions (or Purchase Orders).” Simply summarizing the number and value of transactions by different dimensions (day of week, week of quarter, or by UserID) can identify anomalies that should be questioned further. On a recent Payroll Wage and Hour project, we found unusual patterns in when people punched in and out: punches occurred much more frequently at some minutes (e.g. 7 or 23 minutes past the hour) than at adjacent minutes (e.g. 8 or 22 minutes past the hour). This condition called for further inquiry and analysis about whether time rounding was fair and equitable for certain types of workers. It is in fact a major compliance risk and should be considered by any employer with a significant number of hourly workers. See the Corporate Counsel article for more information.
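A metric query of this kind can be sketched in a few lines. The punch data below is fabricated to reproduce the rounding pattern described; a real run would read the time-and-attendance extract:

```python
from collections import Counter
from datetime import datetime

def punches_by_minute(timestamps):
    """Count time punches by minute-of-hour (0-59)."""
    return Counter(ts.minute for ts in timestamps)

# Made-up punches clustered at :07 and :23, mimicking the pattern above.
punches = [datetime(2015, 2, 3, 8, m) for m in (7, 7, 7, 8, 22, 23, 23)]
by_minute = punches_by_minute(punches)
# Minutes 7 and 23 dominate their neighbors -- a hint that times are rounded.
```

The point of a metric query is exactly this kind of summary: no exception is declared, but the skewed distribution tells you where to ask the next question.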

Outlier queries are comparative analytics like “Largest Invoice to Average Invoice, by Vendor,” “Most Expensive Airfare by Distance,” or “Most Expensive Travel / Entertainment Event per Person vs. Average Event per Person.” These outlier queries are also essential, in that they help identify patterns or relationships that should be investigated further. Digital analysis such as Benford's Law is a well-known audit example of an outlier query, but there are many more techniques that can yield insight beyond Benford's Law alone.
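For readers who want to try a Benford's Law outlier query, here is a minimal sketch that compares observed leading-digit frequencies with the expected log10(1 + 1/d) distribution; it assumes positive, non-trivial amounts and skips zeros:

```python
import math
from collections import Counter

def benford_deviation(amounts):
    """Return {digit: (observed_freq, expected_freq)} for leading digits 1-9."""
    # Leading significant digit: strip sign, leading zeros and decimal point.
    digits = [int(str(abs(a)).lstrip("0.")[0]) for a in amounts if a]
    counts = Counter(digits)
    n = len(digits)
    return {d: (counts[d] / n, math.log10(1 + 1 / d)) for d in range(1, 10)}
```

Large gaps between observed and expected frequencies (especially a deficit of 1s or an excess of high digits) are the outliers worth drilling into, ideally alongside other outlier measures rather than in isolation.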

Examples of exception queries are more traditional analytics queries such as those listed below:

  • List if two (or more) invoices have been paid for the same amount to the same vendor
  • List any purchase orders created after their corresponding invoice
  • List any Vendors who share a Tax ID Number, Address, or Phone Number with an Employee
  • List any Vendors who have had transactions posted after being Terminated or made Inactive
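The vendor-to-employee match in the third query above might be sketched as follows. The field names are illustrative, not any specific ERP's schema, and a production version would normalize addresses and phone formats before matching:

```python
def vendor_employee_conflicts(vendors, employees,
                              keys=("tax_id", "address", "phone")):
    """Return (vendor_id, employee_id, field) triples where a sensitive
    identifying field is shared between a vendor and an employee."""
    index = {}
    for e in employees:
        for k in keys:
            if e.get(k):
                index[(k, e[k])] = e
    hits = []
    for v in vendors:
        for k in keys:
            if v.get(k) and (k, v[k]) in index:
                hits.append((v["vendor_id"], index[(k, v[k])]["employee_id"], k))
    return hits
```

Because these are exception queries, every hit is a potential finding and goes straight to research, which is exactly why we recommend running them only after the exploratory metric and outlier work.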

In short, we recommend spending at least an hour and as much as a day or more exploring and analyzing your data, before beginning any Exception Queries. A data exploration checklist follows – any additions or other suggestions to this list are welcome.

  • Sort transactions from oldest to newest and from newest to oldest. Any unusual dates or times? Any gaps in date or time stamps? Why?
  • Sort transactions from largest to smallest and smallest to largest. Any unusual negative values?
  • Stratify by various status codes, reason codes, or transaction types. Are all values consistently completed? Any unusual relationships? What do each of the codes and values represent?
  • Stratify by dollar value ranges. Do 20% of the transactions make up 80% of the value? Should they? The Pareto Principle says yes, but your business may vary.
  • Compute Relative Size Factor (largest to average and largest to second largest), and sort again. Do any of these RSF values cause you to want to drill into specifics? Consider whole numbers and large numbers. Why or why not?
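The Relative Size Factor computation from the last checklist item can be sketched as follows, for a single vendor's payment amounts:

```python
def relative_size_factor(amounts):
    """Two RSF variants for one vendor's payment amounts (needs at least 2):
    largest-to-average-of-the-rest and largest-to-second-largest."""
    ordered = sorted(amounts, reverse=True)
    largest, rest = ordered[0], ordered[1:]
    return {
        "largest_to_average": largest / (sum(rest) / len(rest)),
        "largest_to_second": largest / rest[0],
    }
```

Computed per vendor and sorted descending, high RSF values (a single payment dwarfing the vendor's normal pattern) are the rows worth drilling into during exploration.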

What has been your most significant “aha” moment when exploring your data?

Tip 4 – Considering Outliers

Five tips…#4. Consider metric, outlier, and exception queries

For readers seeing this post as their first of the series, today is actually the fourth part of a five-part blog series that has been developed in response to Internal Auditor magazine's lead article titled “The Year Ahead: 2015.” Because so many people make resolutions for the new year, we wanted to help audit and compliance professionals succeed with theirs, especially because we believe there are more than a few whose resolutions include becoming more data-driven in their work through the regular use of data analytics.

Yesterday we defined metric, outlier, and exception queries, and provided examples in the context of related potential audit projects around expenses such as Accounts Payable, Travel and Entertainment, or Payroll. To review, metric queries are simply lists of transactions that measure values against various dimensions or strata, such as rank or time series. Top 10 largest or simply transactions by day of week are examples of metric queries. These metric queries are powerful, and can become even more powerful when combined as part of outlier and exception analysis.

One recent Travel and Expense example from our client work was seeing a number of executive assistants in the “Top 10 Travel Spend reports.” Even before we looked at any exception report it became clear that some of the organization’s executives had their assistants complete and submit their personal expense reports, and then approved those reports themselves.

Outlier queries are those that compare a value to other values, like a mean or standard deviation. As an example, saying that today is twenty degrees colder than average, or the coldest day of winter, is more informative than simply saying that it is sixteen degrees today. Better still, listing the 10 coldest days together, in relation to the average and standard deviation, is even more informative.

We recommend diving into exception queries only after metric and outlier queries have been prepared, explored and analyzed. It’s common for false positives to be averted through thoughtful review of metric and outlier queries.

How does this compare to your experiences?



© Joe Oringel 2015

February 2, 2015

Five Tips for Advancing with Audit Analytics, Part I

Filed under: Data Analytics,Joe Oringel,Visual Risk IQ — tfoxlaw @ 12:01 am

Ed. Note: Joe Oringel, Principal at Visual Risk IQ, recently wrote a series of blog posts on advancing your business through the use of data analytics and audit. I asked Joe if I could repost his articles, which he graciously allowed me to do. So today I begin a 3-day series of blog posts which reprint his posts. Today covers Tips 1 and 2.

For many people today, Monday January 5, is the first work day of 2015. We compliance and audit professionals are like our many co-workers and friends in that we have new goals and ideas that we expect will set this year apart. We want to grow and develop personally and professionally, and have even greater career success, inside and even outside of our current roles. But how?

Even more than in previous years, 2015 is shaping up as the year that Analytics will be adopted by the audit and compliance profession, at least according to Internal Auditor, the global professional journal for internal auditors. See article titled “The Year Ahead: 2015.”

This article quotes several high-profile Chief Audit Executives (CAEs) on the subject of analytics. Raytheon's Larry Harrington, a frequent keynote speaker for the IIA, says that “you will see greater use of data analytics to increase audit coverage without increasing costs” and that “internal audit will leverage analytics from other lines of defense,” such as compliance and risk management. Increased use of analytics will lead to greater value from audit and compliance, as measured by management. But if this were easy, wouldn't we all be doing it already? How should we overcome obstacles such as finding the right people, training, and budgets (as cited by the CAEs in this article)?

Visual Risk IQ has been helping audit and compliance professionals see and understand their data since 2006. We work with all leading audit-specific tools (e.g. CaseWare IDEA, ACL, and newcomer Analyzer, from Arbutus Software), and also with general purpose analytics and visual reporting tools like SQL, Tableau, Oversight, and more. Importantly, we have completed hundreds of engagements for clients across a wide variety of industries.

These five tips are:
1) Consider skills and experience of the team, not individuals, when planning a data analytics project.
2) Begin with the business objectives in mind, and map from these objectives to available data.
3) Understand your data, and explore it fully before developing exception queries.
4) Consider metric, outlier, and exception queries.
5) Supplement necessary skills with internal or external resources.

We’ll be expanding on each of these five tips in blog posts later this week, but here is some information on the first and perhaps most important one.

Tip 1 – Your People 

You should consider skills and experience of the Team, not individuals, when planning a data analytics project.

As part of our consulting projects, and for our inward assessment of our own team members, we use an analytics-focused Body of Knowledge framework that has the following seven key components.

  • Project Management
  • Data Acquisition and Manipulation
  • Statistical techniques
  • Visual Reporting techniques
  • Communication
  • Audit and Compliance Domain expertise
  • Change Management and Strategic Thinking

In our experience, data analytics projects succeed because of project expectations and the corresponding competencies of team members in these seven areas. It's especially important to note that these body of knowledge components are rarely (if ever?) found at a high level within a single individual, and therefore a team approach is needed to accomplish successful analytics projects.

People who have greater skills in project management or communication of issues may not have the requisite technical experience when it comes to data acquisition and manipulation, or statistical techniques. Similarly, it is common for stronger data specialists to be weaker in audit or compliance domain expertise.

So when planning an audit analytics project, be sure that you’ve built a team that has each of these key elements in their skill set, and that they have the incentives and team structure to work together and learn from each other’s expertise.

Tip 2-Brainstorming

Yesterday we started a multi-part post on the importance of building audit data analytics capabilities, together with some “how-to” tips. Our first tip was that this is actually as much of a people challenge as a technical undertaking. One particular “secret” is that a combination of skills is needed to accomplish these analytics projects, and we see many departments make the mistake of assigning a single individual to carry out a project without sufficient assistance, or at least oversight, from colleagues who have complementary skills.

In our data analytics consulting practice, we use a Body of Knowledge framework to identify needed skills for a particular project, and then match at least one “expert” with an “apprentice” that is looking to add to these same skills. Together our teams bring excellent qualifications in each of these domains, but it’s rare that they all arrive in the form of a single consultant. That framework was published here yesterday.

Today’s tip is to “Begin with the business objectives in mind, and map from these objectives to available digital data.” Too often, we see compliance and audit teams request data and begin to interrogate it before understanding the data fully or taking steps to validate control totals and/or data completeness. A related mistake is to exhaustively test a single data file without considering supplemental data sources that may yield greater insight or answer related business questions.

A recent example of why to begin with business questions was a Payroll project that we completed for a retail client. Our team was tasked with searching for “off-the-clock” work. If we had focused only on the available data files, we could have answered questions about meal breaks, rest breaks, and overtime, but perhaps missed other hours worked but not paid. By focusing on the business question first, we identified badge data and cash register data to determine whether employees were in the store and ringing sales, yet off the clock, at the time of the badge swipes or point-of-sale transactions.

As such, the first step in any data analytics project is brainstorming. You can think of it as part of project planning. During this step, teams should identify the business questions that they want to answer with their analytics efforts, and cross-reference these business questions against available reports and digital data. If existing reports fully answer a business question, then a new query may not be needed. But if a report does not currently exist, then analytics should be considered, and understanding the data sources becomes a key next step. During brainstorming, it is very important to understand the number and complexity of the data sources that will be needed, and to focus on a small enough set of business objectives that the number of data sources does not become overwhelming. It is better to have a series of “small win” analytics efforts than a larger, less successful project.



 

© Joe Oringel 2015
