Version: hadoop-0.21.0
OS: Unix (Mac OS X)
package cn.com.fri.hadoop;
import java.io.FileNotFoundException;
import java.io.IOException;
import java.net.URI;
import java.util.Date;
import org.apache.hadoop.fs.FSDataInputStream;
import org.apache.hadoop.fs.FileContext;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.fs.UnsupportedFileSystemException;
import org.apache.hadoop.io.IOUtils;
import org.apache.hadoop.security.AccessControlException;
/**
 * Reads a file's contents and metadata from HDFS.
 * @author alex
 */
public class MainRead {
    public static void main(String[] args) throws AccessControlException,
            FileNotFoundException, UnsupportedFileSystemException, IOException {
        FileContext fc = FileContext.getFileContext(URI.create("hdfs://localhost:9000"));
        FSDataInputStream fsInput = fc.open(new Path("test"));
        try {
            // Copy the file's contents to stdout; "false" keeps the streams open.
            IOUtils.copyBytes(fsInput, System.out, 4096, false);
        } finally {
            IOUtils.closeStream(fsInput);
        }

        FileStatus status = fc.getFileStatus(new Path("test"));
        System.out.println("--- File status ---");
        System.out.println(new Date(status.getAccessTime()));
        System.out.println(status.getPath().toUri().getPath());
        // getLen() returns the length in bytes; convert to KB for display.
        double len = status.getLen() / 1024.0;
        System.out.println("File length: " + len + " KB");
        System.out.println("Block size: " + status.getBlockSize() / 1024 / 1024 + " MB");
        System.out.println(status.getOwner());
    }
}
package cn.com.fri.hadoop;
import java.io.BufferedInputStream;
import java.io.FileInputStream;
import java.io.FileNotFoundException;
import java.io.IOException;
import java.io.InputStream;
import java.net.URI;
import java.util.EnumSet;
import org.apache.hadoop.fs.CreateFlag;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileAlreadyExistsException;
import org.apache.hadoop.fs.FileContext;
import org.apache.hadoop.fs.ParentNotDirectoryException;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IOUtils;
import org.apache.hadoop.security.AccessControlException;
/**
 * Creates a file in HDFS from a local file.
 * @author alex
 */
public class MainCreate {
    public static void main(String[] args) throws AccessControlException,
            FileAlreadyExistsException, FileNotFoundException,
            ParentNotDirectoryException, IOException {
        FileContext fc = FileContext.getFileContext(URI.create("hdfs://localhost:9000"));
        // CREATE: fail if the file already exists (add CreateFlag.OVERWRITE to replace it).
        EnumSet<CreateFlag> es = EnumSet.of(CreateFlag.CREATE);
        FSDataOutputStream out = fc.create(new Path("test"), es);
        InputStream in = new BufferedInputStream(
                new FileInputStream("/Users/alex/Desktop/persons.rtf"));
        // Copy the local file into HDFS; "true" closes both streams when done.
        IOUtils.copyBytes(in, out, 4096, true);
    }
}
When constructing a Path, a name that starts with / is treated as absolute (resolved against the HDFS root); otherwise it is resolved under the user's home directory, /user/&lt;username&gt;. If that parent directory does not exist in HDFS, an exception reporting the missing parent directory is thrown.
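The resolution rule above can be illustrated in plain Java, with no cluster required. This is only a sketch of the rule, not Hadoop's actual implementation; the class and method names here are invented for illustration.

```java
// Illustrates how a FileContext-style path name is resolved:
// a leading "/" means absolute (from the HDFS root), otherwise
// the name lands under the user's home directory /user/<username>.
public class PathResolutionDemo {
    static String resolve(String name, String user) {
        // Absolute paths are used as-is.
        if (name.startsWith("/")) {
            return name;
        }
        // Relative paths are placed under the user's home directory.
        return "/user/" + user + "/" + name;
    }

    public static void main(String[] args) {
        System.out.println(resolve("test", "alex"));   // /user/alex/test
        System.out.println(resolve("/test", "alex"));  // /test
    }
}
```

So `new Path("test")` in the examples above writes to /user/alex/test, while `new Path("/test")` would write to the root of the filesystem.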