Hi!
I'm a novice learning Hadoop. Can I get some advice?
I'm building a REST service, and one of its functions is creating HAR files. I have already implemented copying files from the local FS to HDFS. For this I use org.apache.hadoop.fs.FileSystem, and I create the FileSystem object with the following expression:
FileSystem fs = FileSystem.get(URI.create(hdfsUri), conf);
In my case, hdfsUri is hdfs://10.1.6.23:8020.
As far as I understand from your code, it does the same thing:
conf.set("fs.defaultFS", "hdfs://10.1.6.23:8020");
FileSystem fs = FileSystem.get(conf);
And everything works well.
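For reference, here is a minimal, self-contained sketch of the copy step as I have it (the local file name is just a placeholder; the host, port, and HDFS paths are from my environment):

```java
import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class HdfsCopy {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        String hdfsUri = "hdfs://10.1.6.23:8020";

        // Connect to the NameNode and copy a local file into HDFS
        try (FileSystem fs = FileSystem.get(URI.create(hdfsUri), conf)) {
            fs.copyFromLocalFile(new Path("/tmp/local-file.txt"),
                                 new Path("/user/ldr2hdp/1/2020-05-27/"));
        }
    }
}
```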
Now I want to create a HAR file. For this I use org.apache.hadoop.tools.HadoopArchives, and I create the HadoopArchives object with the following expression:
conf.set("fs.defaultFS", "hdfs://10.1.6.23:8020");
HadoopArchives har = new HadoopArchives(conf);
ToolRunner.run(har, params);
where

String[] params = new String[] {
    "-archiveName",
    "2020-05-27_2.har",       // name of the archive
    "-p",
    "/user/ldr2hdp/1",        // parent path of the sources
    "2020-05-27",             // source directory, relative to the parent
    "/user/ldr2hdp/_arch/1"   // destination directory
};
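For completeness, here is the archiving code in one self-contained piece as I currently have it (a sketch; the exit-code printout is just for my own testing):

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.tools.HadoopArchives;
import org.apache.hadoop.util.ToolRunner;

public class HarCreator {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        conf.set("fs.defaultFS", "hdfs://10.1.6.23:8020");

        // Equivalent to the CLI call:
        // hadoop archive -archiveName 2020-05-27_2.har -p /user/ldr2hdp/1 \
        //     2020-05-27 /user/ldr2hdp/_arch/1
        String[] params = new String[] {
            "-archiveName", "2020-05-27_2.har",
            "-p", "/user/ldr2hdp/1",
            "2020-05-27",
            "/user/ldr2hdp/_arch/1"
        };

        HadoopArchives har = new HadoopArchives(conf);
        int exitCode = ToolRunner.run(har, params);
        System.out.println("HadoopArchives returned " + exitCode);
    }
}
```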
In this case I get an error: "Cannot initialize Cluster. Please check your configuration for mapreduce.framework.name and the correspond server addresses."
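From what I've read, HadoopArchives submits a MapReduce job, so my current guess is that the client is missing the MapReduce/YARN settings that a cluster node would normally pick up from mapred-site.xml and yarn-site.xml. This is what I was planning to try (the ResourceManager address and port 8032 are my assumptions, not verified against our cluster):

```java
Configuration conf = new Configuration();
conf.set("fs.defaultFS", "hdfs://10.1.6.23:8020");

// Guess: submit the archive job to YARN rather than letting the
// client fail to initialize a cluster connection.
conf.set("mapreduce.framework.name", "yarn");

// Guess: point at the ResourceManager explicitly, since the
// application server has no yarn-site.xml. 8032 is the default
// ResourceManager port and may be different in our cluster.
conf.set("yarn.resourcemanager.hostname", "10.1.6.23");
conf.set("yarn.resourcemanager.address", "10.1.6.23:8032");
```

I also wonder whether the hadoop-mapreduce-client-jobclient jar needs to be on my application server's classpath, since as far as I understand it provides the YARN client protocol; I haven't verified that yet.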
In your code (ArchiveManager.jar), the ArchiveManager constructor uses the same conf to create both the FileSystem and HadoopArchives objects, and there I think it works well.
Can you help me understand why I get this error?
Thanks.
P.S. My REST service does not run on the Hadoop cluster; it runs on an application server.