This article collects a number of code examples for the org.nd4j.linalg.factory.Nd4j.valueArrayOf() method in Java and shows how Nd4j.valueArrayOf() is used in practice. The examples are extracted from selected projects hosted on GitHub, Stack Overflow, Maven, and similar platforms, so they should serve as useful references. Details of the Nd4j.valueArrayOf() method are as follows:
Package: org.nd4j.linalg.factory.Nd4j
Class: Nd4j
Method: valueArrayOf
Description: Creates a row vector ndarray with the specified value as the only value in the ndarray. Some people may know this as np.full.
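Before diving into the collected snippets, here is a minimal usage sketch (not from the original article) showing the overloads that appear in the examples below. It assumes only that an ND4J backend such as nd4j-native is on the classpath; the shapes and fill values are arbitrary illustrative choices.

import org.nd4j.linalg.api.ndarray.INDArray;
import org.nd4j.linalg.factory.Nd4j;

public class ValueArrayOfSketch {
    public static void main(String[] args) {
        // Row vector of length 5 filled with 3.0 (the np.full analogue)
        INDArray row = Nd4j.valueArrayOf(5, 3.0);
        System.out.println(row);

        // 2x3 matrix filled with 7.0, using the (rows, columns, value) overload
        INDArray matrix = Nd4j.valueArrayOf(2, 3, 7.0);
        System.out.println(matrix);

        // Arbitrary shape given as a shape array, filled with 1.5
        INDArray cube = Nd4j.valueArrayOf(new long[] {2, 2, 2}, 1.5);
        System.out.println(cube);
    }
}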
Code example source: deeplearning4j/nd4j
@Override
public INDArray doCreate(long[] shape, INDArray paramsView) {
return Nd4j.valueArrayOf(shape,constant);
}
Code example source: deeplearning4j/nd4j
/**
* Returns an ndarray with 1 if the element is epsilon equals
*
* @param other the number to compare
* @return a ndarray with the given
* binary conditions
*/
@Override
public INDArray epsi(Number other) {
INDArray otherArr = Nd4j.valueArrayOf(shape(), other.doubleValue());
return epsi(otherArr);
}
Code example source: deeplearning4j/nd4j
/**
* Prepare the boundaries for processing
* @param bounds the bounds
* @param x the input in to the approximation
* @return the lower and upper bounds as an array of ndarrays
* (in that order) of the same shape as x
*/
public static INDArray[] prepareBounds(INDArray bounds,INDArray x) {
return new INDArray[] {Nd4j.valueArrayOf(x.shape(),bounds.getDouble(0)),
Nd4j.valueArrayOf(x.shape(),bounds.getDouble(1))};
}
Code example source: deeplearning4j/nd4j
/**
* Append the given
* array with the specified value size
* along a particular axis
* @param arr the array to append to
* @param padAmount the pad amount of the array to be returned
* @param val the value to append
* @param axis the axis to append to
* @return the newly created array
*/
public static INDArray prepend(INDArray arr, int padAmount, double val, int axis) {
if (padAmount == 0)
return arr;
long[] paShape = ArrayUtil.copy(arr.shape());
if (axis < 0)
axis = axis + arr.shape().length;
paShape[axis] = padAmount;
INDArray concatArr = Nd4j.valueArrayOf(paShape, val);
return Nd4j.concat(axis, concatArr, arr);
}
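A quick hedged usage note for the prepend helper above (this illustration is not part of the original snippet): prepending a 2x3 matrix of ones with padAmount = 1, val = 0.0 and axis = 1 builds a [2, 1] value array of zeros and concatenates it in front, yielding a 2x4 matrix whose first column is zero.

// Assumes the prepend helper shown above is in scope.
INDArray base = Nd4j.ones(2, 3);
INDArray padded = prepend(base, 1, 0.0, 1);
// padded.shape() == [2, 4]; column 0 is 0.0, the remaining columns are 1.0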
Code example source: deeplearning4j/dl4j-examples
System.out.println(allOnes);
INDArray allTens = Nd4j.valueArrayOf(nRows, nColumns, 10.0);
System.out.println("\nNd4j.valueArrayOf(nRows, nColumns, 10.0)");
System.out.println(allTens);
Code example source: deeplearning4j/nd4j
/**
* Append the given
* array with the specified value size
* along a particular axis
* @param arr the array to append to
* @param padAmount the pad amount of the array to be returned
* @param val the value to append
* @param axis the axis to append to
* @return the newly created array
*/
public static INDArray append(INDArray arr, int padAmount, double val, int axis) {
if (padAmount == 0)
return arr;
long[] paShape = ArrayUtil.copy(arr.shape());
if (axis < 0)
axis = axis + arr.shape().length;
paShape[axis] = padAmount;
INDArray concatArray = Nd4j.valueArrayOf(paShape, val);
return Nd4j.concat(axis, arr, concatArray);
}
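Similarly, for the append helper above, a hedged usage sketch (again not from the original snippet): appending two rows of -1.0 to a 2x3 matrix of ones along axis 0 concatenates a [2, 3] value array after the input, producing a 4x3 result whose last two rows are -1.0.

// Assumes the append helper shown above is in scope.
INDArray base = Nd4j.ones(2, 3);
INDArray padded = append(base, 2, -1.0, 0);
// padded.shape() == [4, 3]; rows 0-1 are 1.0, rows 2-3 are -1.0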
Code example source: deeplearning4j/dl4j-examples
public static void main(String[] args){
int nRows = 3;
int nCols = 5;
long rngSeed = 12345;
//Generate random numbers between -1 and +1
INDArray random = Nd4j.rand(nRows, nCols, rngSeed).muli(2).subi(1);
System.out.println("Array values:");
System.out.println(random);
//For example, we can conditionally replace values less than 0.0 with 0.0:
INDArray randomCopy = random.dup();
BooleanIndexing.replaceWhere(randomCopy, 0.0, Conditions.lessThan(0.0));
System.out.println("After conditionally replacing negative values:\n" + randomCopy);
//Or conditionally replace NaN values:
INDArray hasNaNs = Nd4j.create(new double[]{1.0,1.0,Double.NaN,1.0});
BooleanIndexing.replaceWhere(hasNaNs,0.0, Conditions.isNan());
System.out.println("hasNaNs after replacing NaNs with 0.0:\n" + hasNaNs);
//Or we can conditionally copy values from one array to another:
randomCopy = random.dup();
INDArray tens = Nd4j.valueArrayOf(nRows, nCols, 10.0);
BooleanIndexing.replaceWhere(randomCopy, tens, Conditions.lessThan(0.0));
System.out.println("Conditionally copying values from array 'tens', if original value is less than 0.0\n" + randomCopy);
//One simple task is to count the number of values that match the condition
MatchCondition op = new MatchCondition(random, Conditions.greaterThan(0.0));
int countGreaterThanZero = Nd4j.getExecutioner().exec(op,Integer.MAX_VALUE).getInt(0); //MAX_VALUE = "along all dimensions" or equivalently "for entire array"
System.out.println("Number of values matching condition 'greater than 0': " + countGreaterThanZero);
}
Code example source: deeplearning4j/nd4j
/**
* A getter for the allocated ndarray
* with this {@link SDVariable}.
*
* This getter will lazy initialize an array if one is not found
* based on the associated shape and {@link WeightInitScheme}
* if neither are found, an {@link ND4JIllegalStateException}
* is thrown.
*
* If a {@link DifferentialFunction} is defined, note that
* its getArr() method is called instead.
* @return the {@link INDArray} associated with this variable.
*/
public INDArray getArr() {
if(sameDiff.arrayAlreadyExistsForVarName(getVarName()))
return sameDiff.getArrForVarName(getVarName());
//initialize value if it's actually a scalar constant (zero or 1 typically...)
if(getScalarValue() != null && ArrayUtil.prod(getShape()) == 1) {
INDArray arr = Nd4j.valueArrayOf(getShape(),
getScalarValue().doubleValue());
sameDiff.associateArrayWithVariable(arr,this);
}
else if(sameDiff.getShapeForVarName(getVarName()) == null)
return null;
else {
INDArray newAlloc = getWeightInitScheme().create(sameDiff.getShapeForVarName(getVarName()));
sameDiff.associateArrayWithVariable(newAlloc,this);
}
return sameDiff.getArrForVarName(getVarName());
}
Code example source: deeplearning4j/nd4j
arrayShape = new int[]{};
INDArray array = Nd4j.valueArrayOf(arrayShape, (double) val);
return array;
} else if (tfTensor.getInt64ValCount() > 0) {
arrayShape = new int[]{};
INDArray array = Nd4j.valueArrayOf(arrayShape, (double) val);
return array;
} else if (tfTensor.getFloatValCount() > 0) {
Code example source: deeplearning4j/dl4j-examples
print("ARange", stepOfThreeTillTen);
INDArray allEights = Nd4j.valueArrayOf(new int[] {2,3}, 8);
print("2x3 Eights", allEights);
print("Concatenated arrays on dimension 1", concatenatedAxisOne);
INDArray [] verticalSplit = CustomOperations.split(Nd4j.valueArrayOf(new int[] {9, 9}, 9),
3);
print("Vertical Split", verticalSplit);
INDArray [] horizontalSplit = CustomOperations.hsplit(Nd4j.valueArrayOf(new int[]{10, 10}, 10),
5);
print("Horizontal Split", horizontalSplit);
Code example source: org.deeplearning4j/deeplearning4j-nn
protected INDArray createBias(int nOut, double biasInit, INDArray biasParamView, boolean initializeParameters) {
if (initializeParameters) {
INDArray ret = Nd4j.valueArrayOf(nOut, biasInit);
biasParamView.assign(ret);
}
return biasParamView;
}
Code example source: improbable-research/keanu
public static INDArray valueArrayOf(long[] shape, double value, DataBuffer.Type bufferType) {
Nd4j.setDataType(bufferType);
switch (shape.length) {
case 0:
return scalar(value, bufferType);
case 1:
return reshapeToVector(Nd4j.valueArrayOf(shape, value));
default:
return Nd4j.valueArrayOf(shape, value);
}
}
Code example source: org.nd4j/nd4j-api
/**
* Returns an ndarray with 1 if the element is epsilon equals
*
* @param other the number to compare
* @return a ndarray with the given
* binary conditions
*/
@Override
public INDArray epsi(Number other) {
INDArray otherArr = Nd4j.valueArrayOf(shape(), other.doubleValue());
return epsi(otherArr);
}
Code example source: improbable-research/keanu
private static INDArray performOperationWithScalarTensorPreservingShape(INDArray left, INDArray right, BiFunction<INDArray, INDArray, INDArray> operation) {
if (left.length() == 1 || right.length() == 1) {
long[] resultShape = Shape.broadcastOutputShape(left.shape(), right.shape());
INDArray result = (left.length() == 1) ?
operation.apply(Nd4j.valueArrayOf(right.shape(), left.getDouble(0)), right) :
operation.apply(left, Nd4j.valueArrayOf(left.shape(), right.getDouble(0)));
return result.reshape(resultShape);
} else {
return operation.apply(left, right);
}
}
Code example source: org.deeplearning4j/deeplearning4j-nn
protected INDArray createVisibleBias(NeuralNetConfiguration conf, INDArray visibleBiasView,
boolean initializeParameters) {
org.deeplearning4j.nn.conf.layers.BasePretrainNetwork layerConf =
(org.deeplearning4j.nn.conf.layers.BasePretrainNetwork) conf.getLayer();
if (initializeParameters) {
INDArray ret = Nd4j.valueArrayOf(layerConf.getNIn(), layerConf.getVisibleBiasInit());
visibleBiasView.assign(ret);
}
return visibleBiasView;
}
Code example source: org.nd4j/nd4j-api
/**
* Append the given
* array with the specified value size
* along a particular axis
* @param arr the array to append to
* @param padAmount the pad amount of the array to be returned
* @param val the value to append
* @param axis the axis to append to
* @return the newly created array
*/
public static INDArray append(INDArray arr, int padAmount, double val, int axis) {
if (padAmount == 0)
return arr;
int[] paShape = ArrayUtil.copy(arr.shape());
if (axis < 0)
axis = axis + arr.shape().length;
paShape[axis] = padAmount;
INDArray concatArray = Nd4j.valueArrayOf(paShape, val);
return Nd4j.concat(axis, arr, concatArray);
}
Code example source: org.nd4j/nd4j-api
/**
* Append the given
* array with the specified value size
* along a particular axis
* @param arr the array to append to
* @param padAmount the pad amount of the array to be returned
* @param val the value to append
* @param axis the axis to append to
* @return the newly created array
*/
public static INDArray prepend(INDArray arr, int padAmount, double val, int axis) {
if (padAmount == 0)
return arr;
int[] paShape = ArrayUtil.copy(arr.shape());
if (axis < 0)
axis = axis + arr.shape().length;
paShape[axis] = padAmount;
INDArray concatArr = Nd4j.valueArrayOf(paShape, val);
return Nd4j.concat(axis, concatArr, arr);
}
Code example source: org.nd4j/nd4j-cuda-10.0
ret = Nd4j.zeros(retShape);
} else {
ret = Nd4j.valueArrayOf(retShape, op.zeroDouble());
Code example source: org.nd4j/nd4j-cuda-7.5
protected void buildZ(IndexAccumulation op, int... dimension) {
Arrays.sort(dimension);
for (int i = 0; i < dimension.length; i++) {
if (dimension[i] < 0)
dimension[i] += op.x().rank();
}
//do op along all dimensions
if (dimension.length == op.x().rank())
dimension = new int[] {Integer.MAX_VALUE};
int[] retShape = Shape.wholeArrayDimension(dimension) ? new int[] {1, 1}
: ArrayUtil.removeIndex(op.x().shape(), dimension);
//ensure vector is proper shape
if (retShape.length == 1) {
if (dimension[0] == 0)
retShape = new int[] {1, retShape[0]};
else
retShape = new int[] {retShape[0], 1};
} else if (retShape.length == 0) {
retShape = new int[] {1, 1};
}
INDArray ret = null;
if (Math.abs(op.zeroDouble()) < Nd4j.EPS_THRESHOLD) {
ret = Nd4j.zeros(retShape);
} else {
ret = Nd4j.valueArrayOf(retShape, op.zeroDouble());
}
op.setZ(ret);
}
Code example source: improbable-research/keanu
private static INDArray executeNd4jTransformOpWithPreservedScalarTensorShape(INDArray mask, INDArray right, DataBuffer.Type bufferType, QuadFunction<INDArray, INDArray, INDArray, Long, BaseTransformOp> baseTransformOpConstructor) {
if (mask.length() == 1 || right.length() == 1) {
long[] resultShape = Shape.broadcastOutputShape(mask.shape(), right.shape());
if (mask.length() == 1) {
mask = Nd4j.valueArrayOf(right.shape(), mask.getDouble(0));
Nd4j.getExecutioner().exec(
baseTransformOpConstructor.apply(mask, right, mask, mask.length())
);
} else {
Nd4j.getExecutioner().exec(
baseTransformOpConstructor.apply(mask,
valueArrayOf(mask.shape(), right.getDouble(0), bufferType),
mask,
mask.length()
)
);
}
return mask.reshape(resultShape);
} else {
Nd4j.getExecutioner().exec(
baseTransformOpConstructor.apply(mask, right, mask, mask.length())
);
return mask;
}
}