{{Infobox software
| name = Apache Sqoop
| logo = Apache Sqoop logo.svg
| released = {{Start date and age|2009|06|01|df=yes}} <!-- https://blog.cloudera.com/blog/2009/06/introducing-sqoop/ -->
| screenshot =
| caption =
| developer = [[Apache Software Foundation]]
| discontinued = yes
| latest release version = 1.4.7
| latest release date = {{Start date and age|2017|12|06}}
| latest preview version =
| latest preview date =
| operating system = [[Cross-platform]]
| repo = {{URL|https://gitbox.apache.org/repos/asf?p{{=}}sqoop.git|Sqoop Repository}}
| programming language = [[Java (programming language)|Java]]
| genre = [[Data management]]
| license = [[Apache License 2.0]]
| website = {{URL|https://sqoop.apache.org}}
}}
'''Sqoop''' is a [[command-line interface]] application for transferring data between [[relational database]]s and [[Hadoop]].<ref name="mainpage">{{cite web |url=https://sqoop.apache.org|title=Hadoop: Apache Sqoop|access-date=Sep 8, 2012}}</ref>

The Apache Sqoop project was retired in June 2021 and moved to the Apache Attic.<ref>{{Cite web|title=moving Sqoop to the Attic|url=http://mail-archives.apache.org/mod_mbox/sqoop-user/202106.mbox/browser|access-date=2021-06-27|website=mail-archives.apache.org}}</ref>

==Description==
Sqoop supports incremental loads of a single table or a free-form [[SQL query]], as well as saved jobs that can be run multiple times to import updates made to a database since the last import. Imports can also be used to populate tables in [[Apache Hive|Hive]] or [[HBase]].<ref>{{cite web |url=https://blogs.apache.org/sqoop/entry/apache_sqoop_overview|title=Apache Sqoop - Overview|access-date=Sep 8, 2012}}</ref> Exports can be used to move data from Hadoop into a relational database. The name is a contraction of "SQL-to-Hadoop".<ref>{{cite web |url=https://blog.cloudera.com/blog/2009/06/introducing-sqoop/|title=Introducing Sqoop|access-date=Jan 1, 2019}}</ref>
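Sqoop 1.x is driven entirely from the command line. The following sketch illustrates the import, export, and saved-job modes described above; the JDBC connection string, user name, table names, and HDFS paths are illustrative placeholders, not values taken from any particular deployment:

```shell
# Incremental import: fetch only rows whose check column exceeds the
# last value seen, writing the result into HDFS.
sqoop import \
  --connect jdbc:mysql://dbhost/sales \
  --username etl_user -P \
  --table orders \
  --target-dir /data/orders \
  --incremental append \
  --check-column order_id \
  --last-value 0

# Export: push processed rows from HDFS back into a relational table.
sqoop export \
  --connect jdbc:mysql://dbhost/sales \
  --username etl_user -P \
  --table order_totals \
  --export-dir /data/order_totals

# Saved job: record the import definition once, then re-run it so that
# Sqoop tracks the last imported value between executions.
sqoop job --create nightly-orders -- import \
  --connect jdbc:mysql://dbhost/sales \
  --table orders --incremental append \
  --check-column order_id --last-value 0
sqoop job --exec nightly-orders
```

With <code>--incremental lastmodified</code>, a timestamp column can be used as the check column instead of a monotonically increasing key.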
Sqoop became a top-level [[Apache Software Foundation|Apache]] project in March 2012.<ref>{{cite web |url=https://blogs.apache.org/sqoop/entry/apache_sqoop_graduates_from_incubator|title=Apache Sqoop Graduates from Incubator|access-date=Sep 8, 2012}}</ref>

[[Informatica]] provides a Sqoop-based [[Connector (computer science)|connector]] from version 10.1. [[Pentaho]] provides [[open-source software|open-source]] Sqoop-based connector steps, ''Sqoop Import''<ref name="2015-12-10_PSI" /> and ''Sqoop Export'',<ref name="2015-12-10_PSE" /> in its [[Extract, transform, load|ETL]] suite [[Pentaho Data Integration]] since version 4.5 of the software.<ref name="2012-07-27_dbta" /> [[Microsoft]] uses a Sqoop-based connector to help transfer data from [[Microsoft SQL Server]] databases to Hadoop.<ref>{{cite web |url=https://www.microsoft.com/en-us/download/details.aspx?id=27584|title=Microsoft SQL Server Connector for Apache Hadoop|website=[[Microsoft]]|access-date=Sep 8, 2012}}</ref>

[[Couchbase, Inc.]] also provides a [[Couchbase Server]]–Hadoop connector by means of Sqoop.<ref>{{cite web|url=http://www.couchbase.com/develop/connectors/hadoop|title=Couchbase Hadoop Connector|access-date=Sep 8, 2012|url-status=dead|archive-url=https://web.archive.org/web/20120825184036/http://www.couchbase.com/develop/connectors/hadoop|archive-date=2012-08-25}}</ref>
==See also==
*[[Apache Hadoop]]
*[[Apache Hive]]
*[[Apache HBase]]
==References==
{{Reflist|refs=
<ref name="2012-07-27_dbta">{{cite web
| url = http://www.dbta.com/Editorial/News-Flashes/Big-Data-Analytics-Vendor-Pentaho-Announces-Tighter-Integration-with-Cloudera-Extends-Visual-Interface-to-Include-Hadoop-Sqoop-and-Oozie-84025.aspx
| title = Big Data Analytics Vendor Pentaho Announces Tighter Integration with Cloudera; Extends Visual Interface to Include Hadoop Sqoop and Oozie
| publisher = [[Database Trends and Applications]] (dbta.com)
| date = 2012-07-27
| access-date = 2015-12-08
| archive-url = https://web.archive.org/web/20151208144234/http://www.dbta.com/Editorial/News-Flashes/Big-Data-Analytics-Vendor-Pentaho-Announces-Tighter-Integration-with-Cloudera-Extends-Visual-Interface-to-Include-Hadoop-Sqoop-and-Oozie-84025.aspx
| archive-date = 2015-12-08
| quote = Pentaho’s Business Analytics 4.5 is now certified on Cloudera’s latest releases, Cloudera Enterprise 4.0 and CDH4. Pentaho also announced that its visual design studio capabilities have been extended to the Sqoop and Oozie components of Hadoop.
}}</ref>
<ref name="2015-12-10_PSE">{{cite web
| url = http://wiki.pentaho.com/display/EAI/Sqoop+Export
| title = Sqoop Export
| publisher = [[Pentaho]]
| date = 2015-12-10
| access-date = 2015-12-10
| archive-url = https://web.archive.org/web/20151210171525/http://wiki.pentaho.com/display/EAI/Sqoop+Export
| archive-date = 2015-12-10
| quote = The Sqoop Export job allows you to export data from Hadoop into an RDBMS using Apache Sqoop.
}}</ref>
<ref name="2015-12-10_PSI">{{cite web
| url = http://wiki.pentaho.com/display/EAI/Sqoop+Import
| title = Sqoop Import
| publisher = [[Pentaho]]
| date = 2015-12-10
| access-date = 2015-12-10
| archive-url = https://web.archive.org/web/20151210170913/http://wiki.pentaho.com/display/EAI/Sqoop+Import
| archive-date = 2015-12-10
| quote = The Sqoop Import job allows you to import data from a relational database into the Hadoop Distributed File System (HDFS) using Apache Sqoop.
}}</ref>
}}
==Bibliography==
{{Refbegin}}
*{{Cite book
| last1 = White
| first1 = Tom
| year = 2010
| chapter = Chapter 15: Sqoop
| chapter-url = https://archive.org/details/hadoopdefinitive0000whit/page/477
| title = Hadoop: The Definitive Guide
| url = http://oreilly.com/catalog/9780596521974
| edition = 2nd
| publisher = [[O'Reilly Media]]
| pages = [https://archive.org/details/hadoopdefinitive0000whit/page/477 477–495]
| isbn = 978-1-449-38973-4
}}
{{Refend}}
==External links==
*{{Official website|https://sqoop.apache.org}}
*[https://cwiki.apache.org/confluence/display/SQOOP/Home Sqoop Wiki]
*[https://web.archive.org/web/20140202154003/http://qnalist.com/q/sqoop-user Sqoop Users Mailing List Archives]

{{Apache Software Foundation}}

[[Category:Apache Attic|Sqoop]]
[[Category:Cloud applications]]
[[Category:Hadoop]]