Question 1
You have data in large files that you need to copy from Hadoop HDFS to other storage providers, and you have decided to use odcp, the Oracle Big Data Cloud Service distributed copy utility.
As odcp is compatible with Cloudera Distribution Including Apache Hadoop (CDH), which four storage providers are supported when copying files?
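For context, odcp is invoked much like other Hadoop copy tools, taking source and destination URIs whose schemes identify the storage provider. A minimal sketch of an invocation — the container and path names here are hypothetical, and exact options vary by release, so consult the Oracle Big Data Cloud Service documentation before use:

```shell
# Copy a directory from HDFS to an object store (hypothetical
# container/path names; the URI scheme selects the provider).
odcp hdfs:///user/oracle/incoming swift://myContainer.myAccount/archive
```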
Question 2
ABC Media receives thousands of files every day from many sources. Each text-formatted file is typically 1–2 MB in size, and all of these files must be retained for at least two years. They have heard about Hadoop and the HDFS file system, and want to take advantage of its cost-effective storage for this vast number of files.
Which two recommendations could you give the customer to maintain the effectiveness of HDFS as the number of files grows?
A. Consider breaking files down into smaller files before ingesting.
B. Consider adding additional NameNodes to increase data storage capacity.
C. Reduce the memory available to the NameNode, as 1–2 MB files do not need much memory.
D. Consider concatenating files after ingesting.
E. Use compression to free up space.
Correct answer: AE
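The concern behind this question is that HDFS keeps every file's metadata in NameNode memory, so millions of tiny files are expensive regardless of their total size; merging small files into larger ones is a standard mitigation. A minimal local sketch of that idea, using hypothetical file names and plain Python rather than HDFS APIs:

```python
from pathlib import Path
import tempfile

def concatenate_small_files(src_dir: Path, dest_file: Path) -> int:
    """Merge every *.txt file in src_dir into one large file.

    Returns the number of files merged. Fewer, larger files mean
    fewer NameNode metadata entries once the result is ingested.
    """
    count = 0
    with dest_file.open("wb") as out:
        for small in sorted(src_dir.glob("*.txt")):
            out.write(small.read_bytes())
            count += 1
    return count

# Demo with hypothetical throwaway files.
tmp = Path(tempfile.mkdtemp())
for i in range(3):
    (tmp / f"feed_{i}.txt").write_text(f"record {i}\n")
merged = tmp / "merged.dat"
n = concatenate_small_files(tmp, merged)
```

In a real cluster the same effect is achieved with Hadoop-native mechanisms such as HAR archives or SequenceFiles rather than raw concatenation.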
Question 3
During provisioning, which can you create in order to integrate Big Data Cloud with other Oracle PaaS services?