Srinithya Chikyala's Profile
Expert
50 Points
Questions: 0
Answers: 48

  • Expert Asked on August 8, 2019 in Diyotta Studio.

    For general Diyotta REST API extraction and loading into any database, below are the steps to be followed.

      1. Create a REST data point to configure connectivity to the RESTful API services. Choose the required authentication type and provide the corresponding authentication details, such as user credentials or an access key. Give the REST web services URI you want to extract data from. Refer to https://help.shopify.com/en/api/reference to find the Shopify URL for the tables.

    [screenshot]

    2. Based on the response from the web services URI, create a schema file for the response and use it in the Import Data Object wizard to create the data object. File, JSON, and XML formats are supported as REST objects. The REST API data point will be associated with File, JSON, or XML data objects, which represent the format and structure of the response from the RESTful API service. The Orders, Customers, etc. APIs can be used from Shopify (a minimal request sketch follows these steps). Below is a Facebook data file and JSON data object reference:

    [screenshot]

    A JSON schema file for the above response has been created and imported into Diyotta to create the JSON data object.

    [screenshot]

    3. Create a dataflow with a REST source (File/JSON/XML) and, as the target, the required database where the data needs to be loaded.

    4. Select the source, click on REST Properties, and give the REST extraction details as per the requirement. REST headers can also be included here.

    [screenshot]
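
    As a minimal sketch of what this extraction amounts to outside Diyotta, the snippet below calls a RESTful endpoint and inspects the JSON response so that the schema file for the data object can be written against it. The store URL, credentials, and the orders resource are hypothetical placeholders, not values from this answer.

    # Minimal sketch, assuming a Shopify-style REST endpoint; the URL, credentials,
    # and the "orders" resource are hypothetical placeholders.
    import json
    import requests

    BASE_URL = "https://example-shop.myshopify.com/admin/api/2019-07"  # hypothetical store
    AUTH = ("api_key", "api_password")                                  # hypothetical credentials

    response = requests.get(f"{BASE_URL}/orders.json", auth=AUTH, timeout=30)
    response.raise_for_status()

    payload = response.json()
    # The keys and field types printed here are what the JSON schema file
    # imported into Diyotta must describe.
    print(list(payload.keys()))
    if payload.get("orders"):
        print(json.dumps(payload["orders"][0], indent=2))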

    • 403 views
    • 1 answers
    • 0 votes
  • Expert Asked on July 26, 2019 in Diyotta Scheduler.

    To send emails along with logs using the Scheduler, create an email event with ‘Attach logs’ enabled and then assign it to the command task/file watcher task for which the email needs to be sent. Below are the steps.

    1. Create an email event and check ‘Attach logs’ along with the other details.

       [screenshot]

    2. Create/open the command task/file watcher task and select the Notification tab.
    3. Add the email event created and select the condition on which the email needs to be sent.

       [screenshot]

    4. Schedule the task; the email will be sent along with the logs once the condition is met.
    • 198 views
    • 1 answers
    • 0 votes
  • Expert Asked on July 26, 2019 in Diyotta Studio.

    This issue shows up when the tform database is not selected while creating the data point chosen as the dataflow’s native processing platform. Below are the steps to fix it.

    1. Go to the Splice Machine data point.
    2. Add any database and check the tform (transform) database, which is used to create temporary tables at runtime.

       [screenshot]

    • 350 views
    • 2 answers
    • 0 votes
  • Expert Asked on July 24, 2019 in Diyotta Studio.

    This can be achieved in multiple ways using parameterization concepts for the file name, based on the exact requirement. Below is a basic solution to achieve it:

    1. Create a dataflow with File as a source and required database as a target.

    2. Select the File source instance in the dataflow and click on the ‘Extraction Properties’ tab in the Design tab.

    3. Give ‘File Name’ with the pattern as only the source system name followed by _* and choose ‘File data reference’ as Wildcard (see the matching sketch after these steps).

    [screenshot]

    4. Execute the dataflow.

    5. Further, it can be scheduled in the Scheduler module to trigger the dataflow automatically every day.
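
    As a minimal sketch of what the wildcard file data reference amounts to, the snippet below matches any file whose name starts with the source system prefix, so the exact (for example, date-stamped) file name never has to be specified. The directory and prefix are hypothetical.

    # Minimal sketch of wildcard matching; the directory and "sales_" prefix are
    # hypothetical examples, not values from this answer.
    import glob

    pattern = "/data/inbound/sales_*"   # matches sales_20190724.csv, sales_20190725.csv, ...
    for path in sorted(glob.glob(pattern)):
        print("would be picked up:", path)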

    • 249 views
    • 1 answers
    • 0 votes
  • Expert Asked on July 17, 2019 in Diyotta Agent.

    Organization is a special feature in Diyotta used when you want to assign specific connectors to an agent. In that case, the agent will be available only for that type of connector at the Studio level.

    [screenshot]

    • 208 views
    • 2 answers
    • 0 votes
  • Expert Asked on July 17, 2019 in Architecture.

    Diyotta integrates with ‘Git’ for version control. The points below outline the architecture for this integration:

    • Java API used to perform git operations; no external dependency required on agent/controller nodes
    • On configuring version control from the Admin module, Diyotta creates a git local repository in the path defined in configurations
    • The commit operation exports the JSON specification of the dataflow, then adds and commits it in the local repository (see the sketch after these points)
    • ‘bitbucket’ and ‘github’ are supported for remote repository
    • Push operation pushes changes from the specified branch of the local repo to the specified branch of the remote repository
    • Pull operation gets the latest changes from the specified branch of the remote repo to the specified branch of the local repo
    • Merge operation on branches is out of scope of Diyotta and should be done from the backend
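
    Diyotta performs these git operations through a Java API internally; purely as an illustration of what the commit and push/pull operations amount to, here is a sketch driving the git command line from Python. The repository path, dataflow specification file, and branch name are hypothetical.

    # Illustration only: mirrors what "commit" (export the dataflow's JSON spec,
    # add, commit to the local repo) and "push"/"pull" amount to. The repo path,
    # file name, and branch are hypothetical, not Diyotta defaults.
    import json
    import subprocess

    REPO = "/opt/diyotta/version_control_repo"       # local repo created by Diyotta
    SPEC = REPO + "/dataflows/df_orders_load.json"   # exported dataflow specification

    def git(*args):
        subprocess.run(["git", "-C", REPO, *args], check=True)

    # "Commit": write the exported JSON specification, then add and commit it locally.
    with open(SPEC, "w") as f:
        json.dump({"name": "df_orders_load", "version": 3}, f, indent=2)
    git("add", SPEC)
    git("commit", "-m", "Export dataflow df_orders_load")

    # "Push"/"Pull": move changes between the local branch and the remote branch.
    git("push", "origin", "master")
    git("pull", "origin", "master")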
    • 237 views
    • 1 answers
    • 0 votes
  • Expert Asked on July 17, 2019 in Architecture.

    Diyotta does allow using multiple Hadoop clusters of different versions simultaneously. Diyotta data point seeks the version information of the Hadoop distribution along with other connection information. For each combination of version and distribution, a separate agent service is spun off from the parent agent with its own dependencies. This avoids any conflict between dependencies of different versions.

    • 182 views
    • 1 answers
    • 0 votes
  • Expert Asked on July 17, 2019 in Diyotta Controller.

    To enable HTTPS on the Diyotta Controller, modify server.xml present at $DIYOTTA_HOME/controller/server/tomcat/conf and update the Connector element as below.

    <Connector port="${port.startup}" maxThreads="200"
               scheme="https" secure="true" SSLEnabled="true"
               keystoreFile="<Path to certificate>" keystorePass="*****"
               clientAuth="false" sslProtocol="TLS"/>

    • 195 views
    • 1 answers
    • 0 votes
  • Expert Asked on July 17, 2019 in Diyotta Controller.

    The Diyotta controller communicates with agents using the controller broker. Whenever the controller wants to send a message to an agent, it places the serialized message on the controller broker queue, which is listened to by the Diyotta agent; likewise, there is a message listener at the controller that listens for agent messages from the queue.
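
    As a conceptual sketch only (not Diyotta's actual broker implementation), the snippet below shows one side placing serialized messages on a queue while a listener on the other side consumes them; the message shape and names are hypothetical.

    # Conceptual sketch of queue-based messaging between a "controller" and an
    # "agent"; message fields and names are hypothetical.
    import json
    import queue
    import threading

    controller_to_agent = queue.Queue()   # stands in for the controller broker queue

    def agent_listener():
        # Agent-side listener: take serialized messages off the queue and act on them.
        while True:
            raw = controller_to_agent.get()
            if raw is None:               # shutdown signal for this sketch
                break
            message = json.loads(raw)
            print("agent received:", message["type"], message["payload"])

    listener = threading.Thread(target=agent_listener)
    listener.start()

    # Controller side: serialize a message and place it on the broker queue.
    controller_to_agent.put(json.dumps({"type": "RUN_JOB", "payload": {"job_id": 42}}))
    controller_to_agent.put(None)
    listener.join()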

    • 174 views
    • 1 answers
    • 0 votes
  • Expert Asked on July 17, 2019 in Diyotta Studio.

    We can process an Avro file from an HDFS location by using the property that creates an external table on the Avro file. Go to the data object and, under Properties, select Create External Table = Yes, provide the Warehouse Location (the location where the Avro file is stored), and select File Format = AVRO. The external table will then be created, and we can access the data from that table for further operations.

    [screenshot]
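
    As an illustration only (not the DDL Diyotta generates), the "Create External Table = Yes" and "File Format = AVRO" settings correspond to an external Hive table defined over the HDFS location; the database, table, columns, and path below are hypothetical.

    # Illustration of the external-table idea using PySpark with Hive support;
    # the database, table, columns, and HDFS path are hypothetical.
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.enableHiveSupport().getOrCreate()

    spark.sql("""
        CREATE EXTERNAL TABLE IF NOT EXISTS staging.orders_avro (
            order_id    BIGINT,
            customer_id BIGINT,
            total       DOUBLE
        )
        STORED AS AVRO
        LOCATION 'hdfs:///data/warehouse/orders_avro'
    """)

    # Once the external table exists, the Avro data can be queried like any other table.
    spark.sql("SELECT * FROM staging.orders_avro LIMIT 10").show()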

    • 177 views
    • 1 answers
    • 0 votes