Send your resume to: hr@ivispl.com and sudha.s@ivispl.com

Work Location: 5851 Legacy Circle, Suite 600, Plano, TX 75024

Job Responsibilities:

  • Gather report requirements: Compile thorough, easy-to-reference information about each report, including its purpose, audience, and expected action. Identify the purpose and business process applicable to each report, as well as the audience, analytical workflow, and the action report consumers are expected to take.
  • Consider sitting with consumers of the existing report to understand exactly what they do with it. You may learn that certain elements of the report can be eliminated or improved in the new Power BI version. This process involves additional time investment, but it's valuable for critical or frequently used reports.
  • Identify the report owner and any subject matter expert(s) associated with the report or data domain. They may become the owners of the new Power BI report going forward. Include any specific change management requirements (which typically differ between IT-managed and business-managed solutions) as well as approvals and signoffs, which will be required when changes are made in the future.
  • Clarify report consumer expectations for content delivery. It may be on-demand, interactive execution, embedded within a custom application, or delivery on a schedule using an e-mail subscription. There may also be requirements to trigger alert notifications.
  • Determine must-have and nice-to-have interactivity requirements, such as filters, drill-down actions, or drillthrough actions.
  • Ensure all data sources required by the report are discovered, and data latency needs (data freshness) are understood. Identify historical data, trending, and data snapshot requirements for each report so they can be aligned with the data requirements. Data source documentation can also be useful later on when performing data validation of a new report with its source data.
  • Clarify security requirements (such as allowed viewers, allowed editors, and any row-level security needs), including any exceptions to normal organizational security. Document any data sensitivity level, data privacy, or regulatory/compliance needs.
  • Calculations, KPIs, and business rules: Identify and document all calculations, KPIs, and business rules that are currently defined within the existing report so they can be aligned with the data requirements.
  • Usability, layout, and cosmetic requirements: Identify specific usability, layout, and cosmetic needs related to data visualizations, grouping and sorting requirements, and conditional visibility. Include any specific considerations related to mobile device delivery.
  • Printing and exporting needs: Determine whether there are any requirements specific to printing, exporting, or pixel-perfect layout. These needs will influence which type of report will be most suitable (such as a Power BI, Excel, or paginated report). Be aware that report consumers tend to place a lot of importance on how they've always done things, so don't be afraid to challenge their way of thinking. Be sure to talk in terms of enhancements rather than change.
  • Risks or concerns: Determine whether there are other technical or functional requirements for reports, as well as any risks or concerns regarding the information being presented in them.
  • Open issues and backlog items: Identify any future maintenance, known issues, or deferred requests to add to the backlog at this time.
  • Existing queries: Identify whether there are existing report queries or stored procedures that can be used by a DirectQuery model or a Composite model, or can be converted to an Import model.
  • Types of data sources: Compile the types of data sources that are necessary, including centralized data sources (such as an enterprise data warehouse) as well as non-standard data sources (such as flat files or Excel files that augment enterprise data sources for reporting purposes). Finding where data sources are located, for purposes of data gateway connectivity, is important too.
  • Data structure and cleansing needs: Determine the data structure for each requisite data source, and to what extent data cleansing activities are necessary.
  • Data integration: Assess how data integration will be handled when there are multiple data sources, and how relationships can be defined between each model table. Identify specific data elements needed to simplify the model and reduce its size.
  • Determine the data latency needs for each data source. This will influence decisions about which data storage mode to use. The data refresh frequency for Import model tables is also important to know.
  • Evaluate data volume expectations, which will factor into decisions about large model support and designing DirectQuery or Composite models. Considerations related to historical data needs are also essential. For larger datasets, incremental data refresh will also be necessary.
  • Assess needs for measures, KPIs, and business rules. They will impact decisions regarding where to apply the logic: in the dataset or the data integration process.
  • Consider whether there are master data issues requiring attention. Determine whether integration with an enterprise data catalog is appropriate for enhancing discoverability, accessing definitions, or producing consistent terminology accepted by the organization.
  • Determine whether there are any specific security or data privacy considerations for datasets, including row-level security requirements.
Minimum Requirements:

To perform these duties, the necessary background is typically acquired through a bachelor’s degree in computer science or a related field.