Pentaho 3.2 Data Integration: Beginner's Guide. Explore, transform, validate, and integrate your data with ease
- Authors:
- María Carina Roldán, Doug Moran
- Rating:
- Be the first to rate this book
- Pages:
- 492
- Available formats:
- PDF, ePub, Mobi
Ebook description: Pentaho 3.2 Data Integration: Beginner's Guide. Explore, transform, validate, and integrate your data with ease
María Carina Roldán, Doug Moran - other books
-
Pentaho Data Integration (PDI, also called Kettle), one of the leading data integration tools, is widely used for all kinds of data manipulation, such as migrating data between applications or databases, exporting data from databases to flat files, data cleansing, and much more. Do you need quick ...
Pentaho Data Integration 4 Cookbook. Over 70 recipes to solve ETL problems using Pentaho Kettle
Adrián Sergio Pulvirenti, Lance Walter, María Carina Roldán
-
Pentaho Data Integration (PDI) is an intuitive, graphical environment packed with drag-and-drop design and powerful Extract-Transform-Load (ETL) capabilities. Given its power and flexibility, first attempts to use the tool can be difficult or confusing. This book is the...
Pentaho Data Integration Quick Start Guide. Create ETL processes using Pentaho
-
Pentaho Data Integration (PDI) has an intuitive design environment with powerful ETL capabilities. However, getting started with the tool can be difficult. This book provides the necessary guidance needed to overcome this, bringing together all the new features of Pentaho 8 Data Integration for q...
Learning Pentaho Data Integration 8 CE. An end-to-end guide to exploring, transforming, and integrating your data across multiple sources - Third Edition
-
Pentaho Data Integration is the premier open source ETL tool, providing easy, fast, and effective ways to move and transform data. While PDI is relatively easy to pick up, it can take time to learn the best practices so you can design your transformations to process data faster and more efficient...
Pentaho Data Integration Cookbook. The premier open source ETL tool is at your command with this recipe-packed cookbook. Learn to use data sources in Kettle, avoid pitfalls, and dig out the advanced features of Pentaho Data Integration the easy way. - Second Edition
-
Capturing, manipulating, cleansing, transferring, and loading data effectively are the prime requirements in every IT organization. Achieving these tasks requires people devoted to developing extensive software programs, or investing in ETL or data integration tools that can simplify this work. Pen...
Pentaho Data Integration Beginner's Guide. Get up and running with the Pentaho Data Integration tool using this hands-on, easy-to-read guide with this book and ebook - Second Edition
You can read the ebook "Pentaho 3.2 Data Integration: Beginner's Guide. Explore, transform, validate, and integrate your data with ease" on:
- Inkbook, Kindle, Pocketbook, Onyx Boox, and other e-readers
- Windows, MacOS, and other systems
- Windows, Android, iOS, and HarmonyOS systems
- any device or application supporting the PDF, EPub, or Mobi formats
Questions? See the Help tab »
Ebook details
- Original title:
- Pentaho 3.2 Data Integration: Beginner's Guide. Explore, transform, validate, and integrate your data with ease
- Ebook ISBN:
- 978-18-471-9955-3, 9781847199553
- Ebook publication date:
- 2010-04-04. The ebook publication date is often the day the title went on sale and may not match the publication date of the printed book. You can find additional information in the free sample. If in doubt, contact us at sklep@ebookpoint.pl.
- Publication language:
- English
- Pdf file size:
- 9.6MB
- ePub file size:
- 15.9MB
- Mobi file size:
- 25.8MB
Ebook table of contents
- Pentaho 3.2 Data Integration Beginner's Guide
- Table of Contents
- Pentaho 3.2 Data Integration Beginner's Guide
- Credits
- Foreword
- The Kettle Project
- About the Author
- About the Reviewers
- Preface
- How to read this book
- What this book covers
- What you need for this book
- Who this book is for
- Conventions
- Reader feedback
- Customer support
- Errata
- Piracy
- Questions
- 1. Getting Started with Pentaho Data Integration
- Pentaho Data Integration and Pentaho BI Suite
- Exploring the Pentaho Demo
- Pentaho Data Integration and Pentaho BI Suite
- Pentaho Data Integration
- Using PDI in real world scenarios
- Loading datawarehouses or datamarts
- Integrating data
- Data cleansing
- Migrating information
- Exporting data
- Integrating PDI using Pentaho BI
- Pop quiz PDI data sources
- Loading datawarehouses or datamarts
- Using PDI in real world scenarios
- Installing PDI
- Time for action installing PDI
- What just happened?
- Pop quiz PDI prerequisites
- Launching the PDI graphical designer: Spoon
- Time for action starting and customizing Spoon
- What just happened?
- Spoon
- Setting preferences in the Options window
- Storing transformations and jobs in a repository
- Creating your first transformation
- Time for action creating a hello world transformation
- What just happened?
- Directing the Kettle engine with transformations
- Exploring the Spoon interface
- Viewing the transformation structure
- Running and previewing the transformation
- What just happened?
- Time for action running and previewing the hello_world transformation
- What just happened?
- Previewing the results in the Execution Results window
- What just happened?
- Pop quiz PDI basics
- Installing MySQL
- Time for action installing MySQL on Windows
- What just happened?
- Time for action installing MySQL on Ubuntu
- What just happened?
- Summary
- 2. Getting Started with Transformations
- Reading data from files
- Time for action reading results of football matches from files
- What just happened?
- Input files
- Input steps
- Reading several files at once
- Time for action reading all your files at a time using a single Text file input step
- What just happened?
- Time for action reading all your files at a time using a single Text file input step and regular expressions
- What just happened?
- Regular expressions
- Troubleshooting reading files
- What just happened?
- Grids
- Have a go hero explore your own files
- Sending data to files
- Time for action sending the results of matches to a plain file
- What just happened?
- Output files
- Output steps
- Some data definitions
- Rowset
- Streams
- The Select values step
- Have a go hero extending your transformations by writing output files
- Getting system information
- Time for action updating a file with news about examinations
- What just happened?
- Getting information by using Get System Info step
- Data types
- Date fields
- Numeric fields
- Running transformations from a terminal window
- Time for action running the examination transformation from a terminal window
- What just happened?
- Have a go hero using different date formats
- Have a go hero formatting 99.55
- Pop quiz formatting data
- XML files
- Time for action getting data from an XML file with information about countries
- What just happened?
- What is XML
- PDI transformation files
- Getting data from XML files
- XPath
- Configuring the Get data from XML step
- Kettle variables
- How and when you can use variables
- Have a go hero exploring XML files
- Have a go hero enhancing the output countries file
- Have a go hero documenting your work
- Summary
- 3. Basic Data Manipulation
- Basic calculations
- Time for action reviewing examinations by using the Calculator step
- What just happened?
- Adding or modifying fields by using different PDI steps
- The Calculator step
- The Formula step
- Time for action reviewing examinations by using the Formula step
- What just happened?
- Have a go hero listing students and their examinations results
- Pop quiz concatenating strings
- Calculations on groups of rows
- Time for action calculating World Cup statistics by grouping data
- What just happened?
- Group by step
- Have a go hero calculating statistics for the examinations
- Have a go hero listing the languages spoken by country
- Filtering
- Time for action counting frequent words by filtering
- What just happened?
- Filtering rows using the Filter rows step
- Have a go hero playing with filters
- Have a go hero counting words and discarding those that are commonly used
- Looking up data
- Time for action finding out which language people speak
- What just happened?
- The Stream lookup step
- Have a go hero counting words more precisely
- Summary
- 4. Controlling the Flow of Data
- Splitting streams
- Time for action browsing new PDI features by copying a dataset
- What just happened?
- Copying rows
- Have a go hero recalculating statistics
- Distributing rows
- Time for action assigning tasks by distributing
- What just happened?
- Pop quiz data movement (copying and distributing)
- Splitting the stream based on conditions
- Time for action assigning tasks by filtering priorities with the Filter rows step
- What just happened?
- PDI steps for splitting the stream based on conditions
- Time for action assigning tasks by filtering priorities with the Switch/Case step
- What just happened?
- Have a go hero listing languages and countries
- Pop quiz splitting a stream
- Merging streams
- Time for action gathering progress and merging all together
- What just happened?
- PDI options for merging streams
- Time for action giving priority to Bouchard by using Append Stream
- What just happened?
- Have a go hero sorting and merging all tasks
- Have a go hero trying to find missing countries
- Summary
- 5. Transforming Your Data with JavaScript Code and the JavaScript Step
- Doing simple tasks with the JavaScript step
- Time for action calculating scores with JavaScript
- What just happened?
- Using the JavaScript language in PDI
- Inserting JavaScript code using the Modified Java Script Value step
- Adding fields
- Modifying fields
- Turning on the compatibility switch
- Have a go hero adding and modifying fields to the contest data
- Testing your code
- Time for action testing the calculation of averages
- What just happened?
- Testing the script using the Test script button
- What just happened?
- Have a go hero testing the new calculation of the average
- Enriching the code
- Time for action calculating flexible scores by using variables
- What just happened?
- Using named parameters
- Using the special Start, Main, and End scripts
- Using transformation predefined constants
- Pop quiz finding the 7 errors
- Have a go hero keeping the top 10 performances
- Have a go hero calculating scores with Java code
- Reading and parsing unstructured files
- Time for action changing a list of house descriptions with JavaScript
- What just happened?
- Looking at previous rows
- Have a go hero enhancing the houses file
- Have a go hero fill gaps in the contest file
- Avoiding coding by using purpose-built steps
- Have a go hero creating alternative solutions
- Summary
- 6. Transforming the Row Set
- Converting rows to columns
- Time for action enhancing a films file by converting rows to columns
- What just happened?
- Converting row data to column data by using the Row denormalizer step
- Have a go hero houses revisited
- Aggregating data with a Row denormalizer step
- Time for action calculating total scores by performances by country
- What just happened?
- Using Row denormalizer for aggregating data
- Have a go hero calculating scores by skill by continent
- Normalizing data
- Time for action enhancing the matches file by normalizing the dataset
- What just happened?
- Modifying the dataset with a Row Normalizer step
- Summarizing the PDI steps that operate on sets of rows
- Have a go hero verifying the benefits of normalization
- Have a go hero normalizing the Films file
- Have a go hero calculating scores by judge
- Generating a custom time dimension dataset by using Kettle variables
- Time for action creating the time dimension dataset
- What just happened?
- Getting variables
- Time for action getting variables for setting the default starting date
- What just happened?
- Using the Get Variables step
- What just happened?
- Have a go hero enhancing the time dimension
- Pop quiz using Kettle variables inside transformations
- Summary
- 7. Validating Data and Handling Errors
- Capturing errors
- Time for action capturing errors while calculating the age of a film
- What just happened?
- Using PDI error handling functionality
- Aborting a transformation
- Time for action aborting when there are too many errors
- What just happened?
- Aborting a transformation using the Abort step
- What just happened?
- Fixing captured errors
- Time for action treating errors that may appear
- What just happened?
- Treating rows coming to the error stream
- What just happened?
- Pop quiz PDI error handling
- Have a go hero capturing errors while seeing who wins
- Avoiding unexpected errors by validating data
- Time for action validating genres with a Regex Evaluation step
- What just happened?
- Validating data
- Time for action checking films file with the Data Validator
- What just happened?
- Defining simple validation rules using the Data Validator
- What just happened?
- Have a go hero validating the football matches file
- Cleansing data
- Have a go hero cleansing films data
- Summary
- 8. Working with Databases
- Introducing the Steel Wheels sample database
- Connecting to the Steel Wheels database
- Introducing the Steel Wheels sample database
- Time for action creating a connection with the Steel Wheels database
- What just happened?
- Connecting with Relational Database Management Systems
- What just happened?
- Pop quiz defining database connections
- Have a go hero connecting to your own databases
- Exploring the Steel Wheels database
- Time for action exploring the sample database
- What just happened?
- A brief word about SQL
- Exploring any configured database with the PDI Database explorer
- What just happened?
- Have a go hero exploring the sample data in depth
- Have a go hero exploring your own databases
- Querying a database
- Time for action getting data about shipped orders
- What just happened?
- Getting data from the database with the Table input step
- Using the SELECT statement for generating a new dataset
- Making flexible queries by using parameters
- Time for action getting orders in a range of dates by using parameters
- What just happened?
- Adding parameters to your queries
- Making flexible queries by using Kettle variables
- What just happened?
- Time for action getting orders in a range of dates by using variables
- What just happened?
- Using Kettle variables in your queries
- What just happened?
- Pop quiz database datatypes versus PDI datatypes
- Have a go hero querying the sample data
- Sending data to a database
- Time for action loading a table with a list of manufacturers
- What just happened?
- Inserting new data into a database table with the Table output step
- Inserting or updating data by using other PDI steps
- Time for action inserting new products or updating existent ones
- What just happened?
- Time for action testing the update of existing products
- What just happened?
- Inserting or updating data with the Insert/Update step
- What just happened?
- Have a go hero populating a films database
- Have a go hero creating the time dimension
- Have a go hero populating the products table
- Pop quiz Insert/Update step versus Table Output/Update steps
- Pop quiz filtering the first 10 rows
- Eliminating data from a database
- Time for action deleting data about discontinued items
- What just happened?
- Deleting records of a database table with the Delete step
- Have a go hero deleting old orders
- Summary
- 9. Performing Advanced Operations with Databases
- Preparing the environment
- Time for action populating the Jigsaw database
- What just happened?
- Exploring the Jigsaw database model
- Looking up data in a database
- Doing simple lookups
- Time for action using a Database lookup step to create a list of products to buy
- What just happened?
- Looking up values in a database with the Database lookup step
- What just happened?
- Have a go hero preparing the delivery of the products
- Have a go hero refining the transformation
- Doing complex lookups
- Time for action using a Database join step to create a list of suggested products to buy
- What just happened?
- Joining data from the database to the stream data by using a Database join step
- What just happened?
- Have a go hero rebuilding the list of customers
- Introducing dimensional modeling
- Loading dimensions with data
- Time for action loading a region dimension with a Combination lookup/update step
- What just happened?
- Time for action testing the transformation that loads the region dimension
- What just happened?
- Describing data with dimensions
- Loading Type I SCD with a Combination lookup/update step
- Have a go hero adding regions to the Region Dimension
- Have a go hero loading the manufacturers dimension
- Have a go hero loading a mini-dimension
- Keeping a history of changes
- Time for action keeping a history of product changes with the Dimension lookup/update step
- What just happened?
- Time for action testing the transformation that keeps a history of product changes
- What just happened?
- Keeping an entire history of data with a Type II slowly changing dimension
- What just happened?
- Loading Type II SCDs with the Dimension lookup/update step
- Have a go hero keeping a history just for the theme of a product
- Have a go hero loading a Type II SCD dimension
- Pop quiz loading slowly changing dimensions
- Pop quiz loading type III slowly changing dimensions
- Summary
- 10. Creating Basic Task Flows
- Introducing PDI jobs
- Time for action creating a simple hello world job
- What just happened?
- Executing processes with PDI jobs
- Using Spoon to design and run jobs
- Using the transformation job entry
- Pop quiz defining PDI jobs
- Have a go hero loading the dimension tables
- Receiving arguments and parameters in a job
- Time for action customizing the hello world file with arguments and parameters
- What just happened?
- Using named parameters in jobs
- Have a go hero backing up your work
- Running jobs from a terminal window
- Time for action executing the hello world job from a terminal window
- What just happened?
- Have a go hero experiencing Kitchen
- Using named parameters and command-line arguments in transformations
- Time for action calling the hello world transformation with fixed arguments and parameters
- What just happened?
- Have a go hero saying hello again and again
- Have a go hero loading the time dimension from a job
- Deciding between the use of a command-line argument and a named parameter
- Have a go hero analysing the use of arguments and named parameters
- Running job entries under conditions
- Time for action sending a sales report and warning the administrator if something is wrong
- What just happened?
- Changing the flow of execution on the basis of conditions
- Have a go hero refining the sales report
- Creating and using a file results list
- Have a go hero sharing your work
- Summary
- 11. Creating Advanced Transformations and Jobs
- Enhancing your processes with the use of variables
- Time for action updating a file with news about examinations by setting a variable with the name of the file
- What just happened?
- Setting variables inside a transformation
- Have a go hero enhancing the examination tutorial even more
- Have a go hero enhancing the jigsaw database update process
- Have a go hero executing the proper jigsaw database update process
- Enhancing the design of your processes
- Time for action generating files with top scores
- What just happened?
- Pop quiz using the Add Sequence step
- Reusing part of your transformations
- Time for action calculating the top scores with a subtransformation
- What just happened?
- Creating and using subtransformations
- What just happened?
- Have a go hero refining the subtransformation
- Have a go hero counting words more precisely (second version)
- Creating a job as a process flow
- Time for action splitting the generation of top scores by copying and getting rows
- What just happened?
- Transferring data between transformations by using the copy/get rows mechanism
- What just happened?
- Have a go hero modifying the flow
- Nesting jobs
- Time for action generating the files with top scores by nesting jobs
- What just happened?
- Running a job inside another job with a job entry
- Understanding the scope of variables
- What just happened?
- Pop quiz deciding the scope of variables
- Iterating jobs and transformations
- Time for action generating custom files by executing a transformation for every input row
- What just happened?
- Executing for each row
- Have a go hero processing several files at once
- Have a go hero building lists of products to buy
- Have a go hero e-mail students to let them know how they did
- Summary
- 12. Developing and Implementing a Simple Datamart
- Exploring the sales datamart
- Deciding the level of granularity
- Exploring the sales datamart
- Loading the dimensions
- Time for action loading dimensions for the sales datamart
- What just happened?
- Extending the sales datamart model
- Have a go hero loading the dimensions for the puzzles star model
- Loading a fact table with aggregated data
- Time for action loading the sales fact table by looking up dimensions
- What just happened?
- Getting the information from the source with SQL queries
- Translating the business keys into surrogate keys
- Obtaining the surrogate key for a Type I SCD
- Obtaining the surrogate key for a Type II SCD
- Obtaining the surrogate key for the Junk dimension
- Obtaining the surrogate key for the Time dimension
- Pop quiz modifying a star model and loading the star with PDI
- Have a go hero loading a puzzles fact table
- Getting facts and dimensions together
- Time for action loading the fact table using a range of dates obtained from the command line
- What just happened?
- Time for action loading the sales star
- What just happened?
- Have a go hero enhancing the loading process of the sales fact table
- Have a go hero loading the puzzles sales star
- Have a go hero loading the facts once a month
- Getting rid of administrative tasks
- Time for action automating the loading of the sales datamart
- What just happened?
- Have a go hero creating a backup of your work automatically
- Have a go hero enhancing the automate process by sending an e-mail if an error occurs
- Summary
- 13. Taking it Further
- PDI best practices
- Getting the most out of PDI
- Extending Kettle with plugins
- Have a go hero listing the top 10 students by using the Head plugin step
- Overcoming real world risks with some remote execution
- Scaling out to overcome bigger risks
- Pop quiz remote execution and clustering
- Integrating PDI and the Pentaho BI suite
- PDI as a process action
- PDI as a datasource
- More about the Pentaho suite
- PDI Enterprise Edition and Kettle Developer Support
- Summary
- A. Working with Repositories
- Creating a repository
- Time for action creating a PDI repository
- What just happened?
- Creating repositories to store your transformations and jobs
- Working with the repository storage system
- Time for action logging into a repository
- What just happened?
- Logging into a repository by using credentials
- Defining repository user accounts
- Creating transformations and jobs in repository folders
- Creating database connections, partitions, servers, and clusters
- Backing up and restoring a repository
- Examining and modifying the contents of a repository with the Repository explorer
- Migrating from a file-based system to a repository-based system and vice-versa
- Summary
- B. Pan and Kitchen: Launching Transformations and Jobs from the Command Line
- Running transformations and jobs stored in files
- Running transformations and jobs from a repository
- Specifying command line options
- Checking the exit code
- Providing options when running Pan and Kitchen
- Log details
- Named parameters
- Arguments
- Variables
- C. Quick Reference: Steps and Job Entries
- Transformation steps
- Job entries
- D. Spoon Shortcuts
- General shortcuts
- Designing transformations and jobs
- Grids
- Repositories
- E. Introducing PDI 4 Features
- Agile BI
- Visual improvements for designing transformations and jobs
- Experiencing the mouse-over assistance
- Time for action creating a hop with the mouse-over assistance
- What just happened?
- Using the mouse-over assistance toolbar
- What just happened?
- Experiencing the sniff-testing feature
- Experiencing the job drill-down feature
- Experiencing even more visual changes
- Enterprise features
- Summary
- F. Pop Quiz Answers
- Chapter 1
- PDI data sources
- PDI prerequisites
- PDI basics
- Chapter 2
- formatting data
- Chapter 3
- concatenating strings
- Chapter 4
- data movement (copying and distributing)
- splitting a stream
- Chapter 5
- finding the seven errors
- Chapter 6
- using Kettle variables inside transformations
- Chapter 7
- PDI error handling
- Chapter 8
- defining database connections
- database datatypes versus PDI datatypes
- Insert/Update step versus Table Output/Update steps
- filtering the first 10 rows
- Chapter 9
- loading slowly changing dimensions
- loading type III slowly changing dimensions
- Chapter 10
- defining PDI jobs
- Chapter 11
- using the Add sequence step
- deciding the scope of variables
- Chapter 12
- modifying a star model and loading the star with PDI
- Chapter 13
- remote execution and clustering
- Index