<?xml version='1.0' encoding='UTF-8'?><codeBook xmlns="ddi:codebook:2_5" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="ddi:codebook:2_5 https://ddialliance.org/Specification/DDI-Codebook/2.5/XMLSchema/codebook.xsd" version="2.5"><docDscr><citation><titlStmt><titl>An overview and implementation of extraction-transformation-loading (ETL) process in data warehouse (Case study: Department of agriculture)</titl><IDNo agency="DOI">doi:10.34820/FK2/KJ9FLS</IDNo></titlStmt><distStmt><distrbtr source="archive">Telkom University Dataverse</distrbtr><distDate>2023-10-05</distDate></distStmt><verStmt source="archive"><version date="2023-10-05" type="RELEASED">1</version></verStmt><biblCit>wijaya, Rahmadi Wijaya, 2023, "An overview and implementation of extraction-transformation-loading (ETL) process in data warehouse (Case study: Department of agriculture)", https://doi.org/10.34820/FK2/KJ9FLS, Telkom University Dataverse, V1</biblCit></citation></docDscr><stdyDscr><citation><titlStmt><titl>An overview and implementation of extraction-transformation-loading (ETL) process in data warehouse (Case study: Department of agriculture)</titl><IDNo agency="DOI">doi:10.34820/FK2/KJ9FLS</IDNo></titlStmt><rspStmt><AuthEnty affiliation="Fakultas Ilmu Terapan">wijaya, Rahmadi Wijaya</AuthEnty></rspStmt><prodStmt/><distStmt><distrbtr source="archive">Telkom University Dataverse</distrbtr><contact affiliation="Fakultas Ilmu Terapan" email="rahmadi@telkomuniversity.ac.id">wijaya, Rahmadi Wijaya</contact><depositr>wijaya, Rahmadi Wijaya</depositr><depDate>2022-04-02</depDate></distStmt></citation><stdyInfo><subject><keyword>Agricultural Sciences</keyword></subject><abstract>The extraction-transformation-loading (ETL) process in data warehouse development extracts data from various sources, transforms the data into a suitable format, and loads it into the data warehouse storage. 
The ETL process includes a data cleansing function that handles data redundancy, inconsistency, and integrity. The ETL process moves data from the source to the integration layer (the data store in the data warehouse). In the integration layer, the data can be grouped into smaller, more specific scopes to meet requirements in other repositories called data marts. A data warehouse reporting program is associated with a data mart as its data source. In this research, the data warehouse is built to handle the ETL process, and it builds metadata to support that process. Constructing metadata for ETL processes leads to ETL programs with a high degree of reusability. The conclusion of this research is that a dynamic ETL process (using ETL metadata) is required when the ETL process deals with an operational system that is still unstable and likely to change its database schema. A dynamic ETL process is also needed to address users' increasing demand for reports.</abstract><sumDscr/></stdyInfo><method><dataColl><sources/></dataColl><anlyInfo/></method><dataAccs><notes type="DVN:TOU" level="dv">CC0 Waiver</notes><setAvail/><useStmt/></dataAccs><othrStdyMat/></stdyDscr></codeBook>