Java stemming with WordNet, IO error: too many open files
I am using WordNet to perform stemming on an input text file.
For inputs with a small number of instances everything works fine, but when I increase the number of input instances the program crashes with the following error:
Exception in thread "main" java.io.FileNotFoundException: /usr/local/WordNet-3.0/dict/index.adj (Too many open files)
I suspect I need to close something that is being opened, but I am not sure what. I tried implementing a solution based on this question by closing the file being read, but it did not help. That leads me to believe the problem is more on the WordNet side, though I cannot be sure.
I am not very familiar with Java IO operations, so perhaps an expert in this area can point out where my mistake is.
Here is where I read the input file:
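Before guessing at the leak, it can help to confirm the per-process file-descriptor limit and see which files the running JVM actually holds open. A minimal diagnostic sketch (the `lsof` line is commented out because it needs the live process id, e.g. from `jps` or `pgrep java`):

```shell
# Check the per-process open-file limit (often 1024 by default on Linux)
ulimit -n

# While the program runs, list its open files and look for WordNet entries
# (replace <pid> with the Java process id):
# lsof -p <pid> | grep dict
```

If `lsof` shows many duplicate handles on files under `dict/`, the leak is on the WordNet side; if it shows many copies of your input files, the reader is the culprit.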
```java
public void readFile(String input,
                     Ontology ontology,
                     List<Sentence> sentences,
                     Map<String, List<Integer>> subject2index,
                     Map<String, List<Integer>> object2index,
                     Set<String> joints) throws IOException
{
    // do what we came here to do, read the input
    @SuppressWarnings("resource")
    BufferedReader br = new BufferedReader(new FileReader(input));
    // now that we're reading the file we need to send it on
    // to the regex extractor after we're through
    RegEx regex_extractor = new RegEx();
    String line;
    while ((line = br.readLine()) != null)
    {
        regex_extractor.match_regex_patterns(line, ontology, sentences, subject2index, object2index, joints);
    }
}
```
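Note that the `BufferedReader` in the method above is never closed, so each call leaks one file descriptor (the `@SuppressWarnings("resource")` annotation hides the compiler warning about exactly that). A minimal, self-contained sketch of the same read loop using try-with-resources, with the `RegEx` call stubbed out since that class is not shown here:

```java
import java.io.BufferedReader;
import java.io.FileReader;
import java.io.FileWriter;
import java.io.IOException;

public class ReadFileDemo {
    // Same loop shape as readFile above, but try-with-resources
    // guarantees the reader is closed even if an exception is
    // thrown mid-loop.
    static int countLines(String input) throws IOException {
        int lines = 0;
        try (BufferedReader br = new BufferedReader(new FileReader(input))) {
            String line;
            while ((line = br.readLine()) != null) {
                // here you would call
                // regex_extractor.match_regex_patterns(line, ...)
                lines++;
            }
        } // br.close() happens automatically here
        return lines;
    }

    public static void main(String[] args) throws IOException {
        // write a small demo file so the example is self-contained
        try (FileWriter fw = new FileWriter("demo.txt")) {
            fw.write("first line\nsecond line\n");
        }
        System.out.println(countLines("demo.txt")); // prints 2
    }
}
```

This alone only leaks one descriptor per `readFile` call, so it may not fully explain the crash, but it is the one visible leak in the code shown.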
And here is the WordNet component:
```java
public WordNet() throws IOException
{
    // construct the URL to the WordNet dictionary directory
    wnhome = System.getenv("WNHOME");
    path = wnhome + File.separator + "dict";
    url = new URL("file", null, path);

    // construct the dictionary object and open it
    dict = new Dictionary(url);
    dict.open();
}

public String getStem(String word)
{
    WordnetStemmer stem = new WordnetStemmer(dict);
    List<String> stemmed_words = stem.findStems(word, POS.VERB);
    if (!stemmed_words.isEmpty())
        return stemmed_words.get(0);
    else
        return word;
}
```
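The class names `Dictionary`, `WordnetStemmer`, and `POS` match MIT's JWI library; assuming that is what is in use, a common cause of this error is constructing a fresh `WordNet` (and thus reopening the index files) per lookup instead of reusing one instance. One way to rule out file-handle exhaustion on the WordNet side is JWI's `RAMDictionary`, which loads the whole database into memory once so no `dict/` files stay open afterwards. An untested sketch, not a verified fix, with the fields and `WNHOME` lookup taken from the original:

```java
// Sketch assuming MIT JWI (edu.mit.jwi). RAMDictionary reads the whole
// WordNet database into memory up front, so no dict files remain open.
import java.io.File;
import java.io.IOException;
import java.net.URL;
import java.util.List;
import edu.mit.jwi.IRAMDictionary;
import edu.mit.jwi.RAMDictionary;
import edu.mit.jwi.data.ILoadPolicy;
import edu.mit.jwi.item.POS;
import edu.mit.jwi.morph.WordnetStemmer;

public class WordNet {
    private final IRAMDictionary dict;
    private final WordnetStemmer stemmer;

    public WordNet() throws IOException {
        String wnhome = System.getenv("WNHOME");
        URL url = new URL("file", null, wnhome + File.separator + "dict");
        // IMMEDIATE_LOAD pulls everything into RAM when open() is called
        dict = new RAMDictionary(url, ILoadPolicy.IMMEDIATE_LOAD);
        dict.open();
        // one stemmer, reused for every lookup
        stemmer = new WordnetStemmer(dict);
    }

    public String getStem(String word) {
        List<String> stems = stemmer.findStems(word, POS.VERB);
        return stems.isEmpty() ? word : stems.get(0);
    }

    public void close() {
        dict.close();
    }
}
```

Whichever dictionary class is used, make sure exactly one `WordNet` instance is created for the whole run and that `close()` is called when stemming is finished.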
# Answer 1
Try this