I'm parsing a CSV file. Actually, I have this code:
alias NimbleCSV.RFC4180, as: CSV

defmodule Siren do
  def parseCSV do
    IO.puts("Let's parse CSV file!")
    stream = File.stream!("name.csv")
    original_line = CSV.parse_stream(stream)

    filter_line =
      Stream.filter(original_line, fn
        ["JeremyGuthrie" | _] -> true
        _ -> false
      end)

    map =
      Stream.map(filter_line, fn [name, team, position, height, weight, age] ->
        %{
          name: name,
          team: team,
          position: position,
          height: String.to_integer(height),
          weight: String.to_integer(weight),
          age: Float.parse(age) |> elem(0)
        }
      end)
  end
end
As I understand it, I build a stream that processes each line of my name.csv file. Using the NimbleCSV library, I parse each line and skip the header row. Then I filter the rows, keeping only the one that corresponds to JeremyGuthrie. Finally, I store the row's elements in a structured map. But now, how do I print only the name from my filtered row, here Jeremy Guthrie?
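Something like the following is what I have in mind, though I'm not sure it's the right approach (just a sketch, assuming parseCSV returns the stream built above):

Siren.parseCSV()
|> Enum.map(& &1.name)  # terminate the stream and keep only the :name key
|> IO.inspect()         # hoping for ["JeremyGuthrie"]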
"我还有一个问题"我有一些问题,过滤我的流根据一个数字,如年龄,身高或体重。
Here is another piece of code where I apply Aleksei's suggestion:
NimbleCSV.define(MyParser, separator: ";", escape: "\"")

defmodule Siren do
  def parseCSV do
    IO.puts("Let's parse CSV file!")

    "ActeursEOF.csv"
    |> File.stream!()
    |> MyParser.parse_stream()
    |> Stream.filter(fn
      ["RAZEL BEC" | _] -> true
      ["" | _] -> false
      _ -> false
    end)
    |> Stream.map(fn [name, description, enr_competences] ->
      %{name: name, description: description, enr_competences: enr_competences}
    end)
    |> Enum.to_list()
    |> IO.inspect()
  end
end
My output:
Compiling 1 file (.ex)
Let's parse CSV file!
[%{description: "Génie Civil", enr_competences: "Oui", name: "RAZEL BEC"}]
But now, to close out this topic, I would like to access and store just the description, for example, and finally display that data. I don't know how to do that...
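What I picture is something along these lines, assuming the list produced by Enum.to_list/1 above is bound to rows (a sketch only):

rows = Siren.parseCSV()
descriptions = Enum.map(rows, & &1.description)  # keep only each row's description
IO.inspect(descriptions)                         # expecting ["Génie Civil"]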
1 Answer
Producing intermediate variables is redundant; in Elixir we have Kernel.|>/2, a.k.a. the pipe operator, to pipe a function's output into the first argument of the next function. Note the last line in the chain: streams are to be terminated to retrieve the result. Until the termination happens, the stream is lazily constructed but not evaluated at all. That makes it possible to, e.g., produce and operate on infinite streams.
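As a tiny illustration of that laziness (a minimal sketch, not from your code):

doubled = Stream.map(1..3, &(&1 * 2))  # lazy: nothing is computed yet
Enum.to_list(doubled)                  # => [2, 4, 6]; evaluation happens here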
Any greedy function from the Enum module would do: Enum.take/2, or, as I pointed out above, Enum.to_list/1.

For the sake of reference: in the future, when you feel fully familiar with Elixir, you might use Flow instead of Stream to parallelize the mapping. For now (and for relatively small files) Stream is good enough.
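For completeness, a rough sketch of what that Flow variant might look like, assuming the flow package has been added as a dependency (an illustration, not a drop-in replacement):

"ActeursEOF.csv"
|> File.stream!()
|> MyParser.parse_stream()
|> Flow.from_enumerable()                        # distribute rows across stages
|> Flow.filter(&match?(["RAZEL BEC" | _], &1))   # same filter as above, in parallel
|> Flow.map(fn [name, description, enr_competences] ->
  %{name: name, description: description, enr_competences: enr_competences}
end)
|> Enum.to_list()                                # Flow implements Enumerable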