Usage and code examples of the org.nd4j.linalg.factory.Nd4j.getMemoryManager() method


This article collects a number of Java code examples of the org.nd4j.linalg.factory.Nd4j.getMemoryManager() method and shows how Nd4j.getMemoryManager() is used in practice. The examples come mainly from open-source projects on platforms such as GitHub, Stack Overflow, and Maven, and should serve as a useful reference. Details of the Nd4j.getMemoryManager() method are as follows:
Package: org.nd4j.linalg.factory.Nd4j
Class: Nd4j
Method: getMemoryManager

About Nd4j.getMemoryManager

This method returns the backend-specific MemoryManager implementation, used for low-level memory management.
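
Before the collected examples, here is a minimal self-contained sketch of how the memory manager is typically used: querying and switching the calling thread's current workspace, temporarily stepping outside any workspace scope, and performing a backend-aware copy between buffers with memcpy. This sketch is not taken from the nd4j sources; the class name MemoryManagerDemo and the buffer sizes are illustrative, and the import paths assume a 1.x-era nd4j API.

    import org.nd4j.linalg.api.buffer.DataBuffer;
    import org.nd4j.linalg.api.memory.MemoryWorkspace;
    import org.nd4j.linalg.factory.Nd4j;

    // Hypothetical demo class, not part of nd4j.
    public class MemoryManagerDemo {
        public static void main(String[] args) {
            // The workspace the calling thread is currently attached to (null when none is open).
            MemoryWorkspace current = Nd4j.getMemoryManager().getCurrentWorkspace();
            System.out.println("Current workspace: " + (current == null ? "none" : current.getId()));

            // Temporarily step outside any workspace, e.g. to create arrays that must
            // outlive a workspace cycle (the same pattern as getRange()/build() below).
            try (MemoryWorkspace ws = Nd4j.getMemoryManager().scopeOutOfWorkspaces()) {
                Nd4j.create(3, 3); // allocated outside any workspace
            }

            // Backend-aware raw copy between two equally sized buffers.
            DataBuffer src = Nd4j.createBuffer(16, true);   // initialized buffer
            DataBuffer dst = Nd4j.createBuffer(16, false);  // uninitialized buffer
            Nd4j.getMemoryManager().memcpy(dst, src);
        }
    }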

Code examples

Code example source: deeplearning4j/nd4j

    /**
     * This method notifies given Workspace that new use cycle is starting now
     *
     * @return
     */
    @Override
    public MemoryWorkspace notifyScopeEntered() {
        parentWorkspace = Nd4j.getMemoryManager().getCurrentWorkspace();
        Nd4j.getMemoryManager().setCurrentWorkspace(null);
        return this;
    }

Code example source: deeplearning4j/nd4j

    @Override
    public void close() {
        Nd4j.getMemoryManager().setCurrentWorkspace(parentWorkspace);
    }

Code example source: deeplearning4j/nd4j

    /**
     * This method TEMPORARY enters this workspace, without reset applied
     *
     * @return
     */
    @Override
    public MemoryWorkspace notifyScopeBorrowed() {
        if (isBorrowed.get())
            throw new ND4JIllegalStateException("Workspace [" + id + "]: Can't borrow from borrowed workspace");

        borrowingWorkspace = Nd4j.getMemoryManager().getCurrentWorkspace();
        isBorrowed.set(true);
        Nd4j.getMemoryManager().setCurrentWorkspace(this);
        return this;
    }

Code example source: deeplearning4j/nd4j

    public static DataBuffer createBuffer(long[] shape, DataBuffer.Type type) {
        long length = ArrayUtil.prodLong(shape);
        if (type == DataBuffer.Type.INT)
            return Nd4j.getMemoryManager().getCurrentWorkspace() == null
                    ? DATA_BUFFER_FACTORY_INSTANCE.createInt(length, true)
                    : DATA_BUFFER_FACTORY_INSTANCE.createInt(length, true, Nd4j.getMemoryManager().getCurrentWorkspace());
        else if (type == DataBuffer.Type.LONG)
            return Nd4j.getMemoryManager().getCurrentWorkspace() == null
                    ? DATA_BUFFER_FACTORY_INSTANCE.createLong(length, true)
                    : DATA_BUFFER_FACTORY_INSTANCE.createLong(length, true, Nd4j.getMemoryManager().getCurrentWorkspace());
        else if (type == DataBuffer.Type.HALF)
            return Nd4j.getMemoryManager().getCurrentWorkspace() == null
                    ? DATA_BUFFER_FACTORY_INSTANCE.createHalf(length, true)
                    : DATA_BUFFER_FACTORY_INSTANCE.createHalf(length, true, Nd4j.getMemoryManager().getCurrentWorkspace());
        else if (type == DataBuffer.Type.DOUBLE)
            return Nd4j.getMemoryManager().getCurrentWorkspace() == null
                    ? DATA_BUFFER_FACTORY_INSTANCE.createDouble(length, true)
                    : DATA_BUFFER_FACTORY_INSTANCE.createDouble(length, true, Nd4j.getMemoryManager().getCurrentWorkspace());
        else
            return Nd4j.getMemoryManager().getCurrentWorkspace() == null
                    ? DATA_BUFFER_FACTORY_INSTANCE.createFloat(length, true)
                    : DATA_BUFFER_FACTORY_INSTANCE.createFloat(length, true, Nd4j.getMemoryManager().getCurrentWorkspace());
    }

Code example source: deeplearning4j/nd4j

    /**
     * Get the feature wise range for the statistics.
     * Note that this is a lazy getter. It is only computed when needed.
     *
     * @return the feature wise range given the min and max
     */
    public INDArray getRange() {
        if (range == null) {
            try (MemoryWorkspace ws = Nd4j.getMemoryManager().scopeOutOfWorkspaces()) {
                range = upper.sub(lower);
            }
        }
        return range;
    }

Code example source: deeplearning4j/nd4j

    /**
     * This method temporary opens block out of any workspace scope.
     * <p>
     * PLEASE NOTE: Do not forget to close this block.
     *
     * @return
     */
    @Override
    public MemoryWorkspace scopeOutOfWorkspaces() {
        MemoryWorkspace workspace = Nd4j.getMemoryManager().getCurrentWorkspace();
        if (workspace == null)
            return new DummyWorkspace();
        else {
            Nd4j.getMemoryManager().setCurrentWorkspace(null);
            return workspace.tagOutOfScopeUse();
        }
    }

Code example source: deeplearning4j/nd4j

    /**
     * Create a buffer equal of length prod(shape)
     *
     * @param data the shape of the buffer to create
     * @return the created buffer
     */
    public static DataBuffer createBuffer(int[] data) {
        DataBuffer ret;
        ret = Nd4j.getMemoryManager().getCurrentWorkspace() == null
                ? DATA_BUFFER_FACTORY_INSTANCE.createInt(data)
                : DATA_BUFFER_FACTORY_INSTANCE.createInt(data, Nd4j.getMemoryManager().getCurrentWorkspace());
        logCreationIfNecessary(ret);
        return ret;
    }

Code example source: deeplearning4j/nd4j

    @Override
    public DataBuffer decompress(DataBuffer buffer) {
        CompressedDataBuffer comp = (CompressedDataBuffer) buffer;
        DataBuffer result = Nd4j.createBuffer(comp.length(), false);
        Nd4j.getMemoryManager().memcpy(result, buffer);
        return result;
    }

Code example source: deeplearning4j/nd4j

    /**
     * Create a DistributionStats object from the data ingested so far. Can be used multiple times when updating
     * online.
     */
    public MinMaxStats build() {
        if (runningLower == null) {
            throw new RuntimeException("No data was added, statistics cannot be determined");
        }
        try (MemoryWorkspace workspace = Nd4j.getMemoryManager().scopeOutOfWorkspaces()) {
            return new MinMaxStats(runningLower.dup(), runningUpper.dup());
        }
    }

Code example source: deeplearning4j/nd4j

    @Override
    public void assertCurrentWorkspace(@NonNull T arrayType, String msg) {
        validateConfig(arrayType);
        MemoryWorkspace curr = Nd4j.getMemoryManager().getCurrentWorkspace();
        if (!scopeOutOfWs.contains(arrayType) && (curr == null || !getWorkspaceName(arrayType).equals(curr.getId()))) {
            throw new ND4JWorkspaceException("Assertion failed: expected current workspace to be \"" + getWorkspaceName(arrayType)
                    + "\" (for array type " + arrayType + ") - actual current workspace is " + (curr == null ? null : curr.getId())
                    + (msg == null ? "" : ": " + msg));
        }
    }

Code example source: deeplearning4j/nd4j

    /**
     * Create a buffer based on the data opType
     *
     * @param data the data to create the buffer with
     * @return the created buffer
     */
    public static DataBuffer createBuffer(float[] data) {
        DataBuffer ret;
        if (dataType() == DataBuffer.Type.FLOAT)
            ret = Nd4j.getMemoryManager().getCurrentWorkspace() == null
                    ? DATA_BUFFER_FACTORY_INSTANCE.createFloat(data)
                    : DATA_BUFFER_FACTORY_INSTANCE.createFloat(data, Nd4j.getMemoryManager().getCurrentWorkspace());
        else if (dataType() == DataBuffer.Type.HALF)
            ret = Nd4j.getMemoryManager().getCurrentWorkspace() == null
                    ? DATA_BUFFER_FACTORY_INSTANCE.createHalf(data)
                    : DATA_BUFFER_FACTORY_INSTANCE.createHalf(data, Nd4j.getMemoryManager().getCurrentWorkspace());
        else
            ret = Nd4j.getMemoryManager().getCurrentWorkspace() == null
                    ? DATA_BUFFER_FACTORY_INSTANCE.createDouble(ArrayUtil.toDoubles(data))
                    : DATA_BUFFER_FACTORY_INSTANCE.createDouble(ArrayUtil.toDoubles(data), Nd4j.getMemoryManager().getCurrentWorkspace());
        logCreationIfNecessary(ret);
        return ret;
    }

Code example source: deeplearning4j/nd4j

    /**
     * Create a buffer based on the data opType
     *
     * @param data the data to create the buffer with
     * @return the created buffer
     */
    public static DataBuffer createBuffer(float[] data, long offset) {
        DataBuffer ret;
        if (dataType() == DataBuffer.Type.FLOAT)
            ret = Nd4j.getMemoryManager().getCurrentWorkspace() == null
                    ? DATA_BUFFER_FACTORY_INSTANCE.createFloat(offset, data)
                    : DATA_BUFFER_FACTORY_INSTANCE.createFloat(offset, data, Nd4j.getMemoryManager().getCurrentWorkspace());
        else if (dataType() == DataBuffer.Type.HALF)
            ret = Nd4j.getMemoryManager().getCurrentWorkspace() == null
                    ? DATA_BUFFER_FACTORY_INSTANCE.createHalf(offset, data)
                    : DATA_BUFFER_FACTORY_INSTANCE.createHalf(offset, data, Nd4j.getMemoryManager().getCurrentWorkspace());
        else
            ret = Nd4j.getMemoryManager().getCurrentWorkspace() == null
                    ? DATA_BUFFER_FACTORY_INSTANCE.createDouble(offset, ArrayUtil.toDoubles(data))
                    : DATA_BUFFER_FACTORY_INSTANCE.createDouble(offset, ArrayUtil.toDoubles(data), Nd4j.getMemoryManager().getCurrentWorkspace());
        logCreationIfNecessary(ret);
        return ret;
    }

Code example source: deeplearning4j/nd4j

    /**
     * Create a buffer based on the data opType
     *
     * @param data the data to create the buffer with
     * @return the created buffer
     */
    public static DataBuffer createBuffer(double[] data, long offset) {
        DataBuffer ret;
        if (dataType() == DataBuffer.Type.DOUBLE)
            ret = Nd4j.getMemoryManager().getCurrentWorkspace() == null
                    ? DATA_BUFFER_FACTORY_INSTANCE.createDouble(offset, data)
                    : DATA_BUFFER_FACTORY_INSTANCE.createDouble(offset, data, Nd4j.getMemoryManager().getCurrentWorkspace());
        else if (dataType() == DataBuffer.Type.HALF)
            ret = Nd4j.getMemoryManager().getCurrentWorkspace() == null
                    ? DATA_BUFFER_FACTORY_INSTANCE.createHalf(offset, data)
                    : DATA_BUFFER_FACTORY_INSTANCE.createHalf(offset, ArrayUtil.toFloats(data), Nd4j.getMemoryManager().getCurrentWorkspace());
        else
            ret = Nd4j.getMemoryManager().getCurrentWorkspace() == null
                    ? DATA_BUFFER_FACTORY_INSTANCE.createFloat(offset, ArrayUtil.toFloats(data))
                    : DATA_BUFFER_FACTORY_INSTANCE.createFloat(offset, ArrayUtil.toFloats(data), Nd4j.getMemoryManager().getCurrentWorkspace());
        logCreationIfNecessary(ret);
        return ret;
    }

Code example source: deeplearning4j/nd4j

    /**
     * Create a buffer of the given length, using the current data opType.
     *
     * @param length     the length of the buffer
     * @param initialize whether to initialize the buffer contents
     * @return the created buffer
     */
    public static DataBuffer createBuffer(long length, boolean initialize) {
        DataBuffer ret;
        if (dataType() == DataBuffer.Type.FLOAT)
            ret = Nd4j.getMemoryManager().getCurrentWorkspace() == null
                    ? DATA_BUFFER_FACTORY_INSTANCE.createFloat(length, initialize)
                    : DATA_BUFFER_FACTORY_INSTANCE.createFloat(length, initialize, Nd4j.getMemoryManager().getCurrentWorkspace());
        else if (dataType() == DataBuffer.Type.INT)
            ret = DATA_BUFFER_FACTORY_INSTANCE.createInt(length, initialize);
        else if (dataType() == DataBuffer.Type.HALF)
            ret = Nd4j.getMemoryManager().getCurrentWorkspace() == null
                    ? DATA_BUFFER_FACTORY_INSTANCE.createHalf(length, initialize)
                    : DATA_BUFFER_FACTORY_INSTANCE.createHalf(length, initialize, Nd4j.getMemoryManager().getCurrentWorkspace());
        else
            ret = Nd4j.getMemoryManager().getCurrentWorkspace() == null
                    ? DATA_BUFFER_FACTORY_INSTANCE.createDouble(length, initialize)
                    : DATA_BUFFER_FACTORY_INSTANCE.createDouble(length, initialize, Nd4j.getMemoryManager().getCurrentWorkspace());
        logCreationIfNecessary(ret);
        return ret;
    }

Code example source: deeplearning4j/nd4j

    /**
     * Create a buffer based on the data opType
     *
     * @param data the data to create the buffer with
     * @return the created buffer
     */
    public static DataBuffer createBuffer(double[] data) {
        DataBuffer ret;
        if (dataType() == DataBuffer.Type.DOUBLE)
            ret = Nd4j.getMemoryManager().getCurrentWorkspace() == null
                    ? DATA_BUFFER_FACTORY_INSTANCE.createDouble(data)
                    : DATA_BUFFER_FACTORY_INSTANCE.createDouble(data, Nd4j.getMemoryManager().getCurrentWorkspace());
        else if (dataType() == DataBuffer.Type.HALF)
            ret = Nd4j.getMemoryManager().getCurrentWorkspace() == null
                    ? DATA_BUFFER_FACTORY_INSTANCE.createHalf(data)
                    : DATA_BUFFER_FACTORY_INSTANCE.createHalf(ArrayUtil.toFloats(data), Nd4j.getMemoryManager().getCurrentWorkspace());
        else
            ret = Nd4j.getMemoryManager().getCurrentWorkspace() == null
                    ? DATA_BUFFER_FACTORY_INSTANCE.createFloat(ArrayUtil.toFloats(data))
                    : DATA_BUFFER_FACTORY_INSTANCE.createFloat(ArrayUtil.toFloats(data), Nd4j.getMemoryManager().getCurrentWorkspace());
        logCreationIfNecessary(ret);
        return ret;
    }

Code example source: deeplearning4j/nd4j

    @Override
    public MemoryWorkspace scopeOutOfWorkspaces() {
        MemoryWorkspace workspace = Nd4j.getMemoryManager().getCurrentWorkspace();
        if (workspace == null)
            return new DummyWorkspace();
        else {
            //Nd4j.getMemoryManager().setCurrentWorkspace(null);
            return new DummyWorkspace().notifyScopeEntered(); //workspace.tagOutOfScopeUse();
        }
    }

Code example source: deeplearning4j/nd4j

    /**
     * Assert that the specified workspace is open, active, and is the current workspace
     *
     * @param ws       Name of the workspace to assert open/active/current
     * @param errorMsg Message to include in the exception, if required
     */
    public static void assertOpenActiveAndCurrent(@NonNull String ws, @NonNull String errorMsg) throws ND4JWorkspaceException {
        if (!Nd4j.getWorkspaceManager().checkIfWorkspaceExistsAndActive(ws)) {
            throw new ND4JWorkspaceException(errorMsg + " - workspace is not open and active");
        }
        MemoryWorkspace currWs = Nd4j.getMemoryManager().getCurrentWorkspace();
        if (currWs == null || !ws.equals(currWs.getId())) {
            throw new ND4JWorkspaceException(errorMsg + " - not the current workspace (current workspace: "
                    + (currWs == null ? null : currWs.getId()));
        }
    }

Code example source: deeplearning4j/nd4j

    /**
     * This method migrates this DataSet into current Workspace (if any)
     */
    @Override
    public void migrate() {
        if (Nd4j.getMemoryManager().getCurrentWorkspace() != null) {
            if (features != null)
                features = features.migrate();
            if (labels != null)
                labels = labels.migrate();
            if (featuresMask != null)
                featuresMask = featuresMask.migrate();
            if (labelsMask != null)
                labelsMask = labelsMask.migrate();
        }
    }

Code example source: deeplearning4j/nd4j

    @Override
    public DataBuffer compress(DataBuffer buffer) {
        CompressionDescriptor descriptor = new CompressionDescriptor(buffer, this);
        BytePointer ptr = new BytePointer(buffer.length() * buffer.getElementSize());
        CompressedDataBuffer result = new CompressedDataBuffer(ptr, descriptor);
        Nd4j.getMemoryManager().memcpy(result, buffer);
        return result;
    }

Code example source: deeplearning4j/nd4j

    @Override
    public void migrate() {
        if (Nd4j.getMemoryManager().getCurrentWorkspace() != null) {
            if (features != null)
                for (int e = 0; e < features.length; e++)
                    features[e] = features[e].migrate();
            if (labels != null)
                for (int e = 0; e < labels.length; e++)
                    labels[e] = labels[e].migrate();
            if (featuresMaskArrays != null)
                for (int e = 0; e < featuresMaskArrays.length; e++)
                    featuresMaskArrays[e] = featuresMaskArrays[e].migrate();
            if (labelsMaskArrays != null)
                for (int e = 0; e < labelsMaskArrays.length; e++)
                    labelsMaskArrays[e] = labelsMaskArrays[e].migrate();
        }
    }
