@Platform(value={"linux-x86_64","macosx-x86_64"}, extension="-gpu") @Name(value="caffe::CuDNNReLULayer<float>") @NoOffset @Properties(inherit=caffe.class) public class FloatCuDNNReLULayer extends FloatReLULayer
Nested classes inherited from class Pointer: Pointer.CustomDeallocator, Pointer.Deallocator, Pointer.NativeDeallocator, Pointer.ReferenceCounter
Constructor and Description |
---|
FloatCuDNNReLULayer(LayerParameter param) |
FloatCuDNNReLULayer(Pointer p): Pointer cast constructor. |
Modifier and Type | Method and Description |
---|---|
protected void | Backward_gpu(FloatBlobVector top, BoolVector propagate_down, FloatBlobVector bottom) |
protected void | Forward_gpu(FloatBlobVector bottom, FloatBlobVector top) |
void | LayerSetUp(FloatBlobVector bottom, FloatBlobVector top): Does layer-specific setup: your layer should implement this function as well as Reshape. |
void | Reshape(FloatBlobVector bottom, FloatBlobVector top): Adjusts the shapes of top blobs and internal buffers to accommodate the shapes of the bottom blobs. |
Methods inherited from class FloatReLULayer: Backward_cpu, Forward_cpu, type

Methods inherited from class FloatNeuronLayer: ExactNumBottomBlobs, ExactNumTopBlobs

Methods inherited from class FloatLayer: AllowForceBackward, AutoTopBlobs, Backward, blobs, CheckBlobCounts, EqualNumBottomTopBlobs, Forward, layer_param, loss, MaxBottomBlobs, MaxTopBlobs, MinBottomBlobs, MinTopBlobs, param_propagate_down, set_loss, set_param_propagate_down, SetUp, ToProto

Methods inherited from class Pointer: address, asBuffer, asByteBuffer, availablePhysicalBytes, calloc, capacity, close, deallocate, deallocateReferences, deallocator, equals, fill, formatBytes, free, getDirectBufferAddress, getPointer, hashCode, interruptDeallocatorThread, isNull, limit, malloc, maxBytes, maxPhysicalBytes, memchr, memcmp, memcpy, memmove, memset, offsetAddress, offsetof, parseBytes, physicalBytes, physicalBytesInaccurate, position, put, realloc, referenceCount, releaseReference, retainReference, setNull, sizeof, toString, totalBytes, totalCount, totalPhysicalBytes, withDeallocator, zero
public FloatCuDNNReLULayer(Pointer p)
Pointer cast constructor. See also: Pointer(Pointer)
public FloatCuDNNReLULayer(@Const @ByRef LayerParameter param)
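The Pointer cast constructor follows JavaCPP's usual pattern: it wraps an existing native address in a new Java object of this class without allocating or constructing anything on the native side. A rough sketch of the idea with plain-Java stand-ins (Handle and CastTarget are hypothetical, not JavaCPP types):

```java
// Hypothetical illustration of a cast constructor: the new object
// shares the wrapped native address instead of allocating new memory.
class Handle {
    final long address;                             // stand-in for Pointer.address()
    Handle(long address) { this.address = address; }
    Handle(Handle p) { this.address = p.address; }  // cast constructor: copy the address
}

class CastTarget extends Handle {
    CastTarget(Handle p) { super(p); }  // like FloatCuDNNReLULayer(Pointer p)
}
```

In the real bindings, `new FloatCuDNNReLULayer(p)` reinterprets the memory at `p` as a `caffe::CuDNNReLULayer<float>`; no native constructor runs.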
@Virtual public void LayerSetUp(@Const @ByRef FloatBlobVector bottom, @Const @ByRef FloatBlobVector top)
Description copied from class: FloatLayer
Overrides: LayerSetUp in class FloatLayer
Parameters:
bottom - the preshaped input blobs, whose data fields store the input data for this layer
top - the allocated but unshaped output blobs
This method should do one-time layer-specific setup. This includes reading and processing relevant parameters from the layer_param_. Setting up the shapes of top blobs and internal buffers should be done in Reshape, which will be called before the forward pass to adjust the top blob sizes.

@Virtual public void Reshape(@Const @ByRef FloatBlobVector bottom, @Const @ByRef FloatBlobVector top)
Description copied from class: FloatLayer
Overrides: Reshape in class FloatNeuronLayer
Parameters:
bottom - the input blobs, with the requested input shapes
top - the top blobs, which should be reshaped as needed
This method should reshape top blobs as needed according to the shapes of the bottom (input) blobs, as well as reshaping any internal buffers and making any other necessary adjustments so that the layer can accommodate the bottom blobs.

@Virtual protected void Forward_gpu(@Const @ByRef FloatBlobVector bottom, @Const @ByRef FloatBlobVector top)
Overrides: Forward_gpu in class FloatReLULayer
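Forward_gpu computes the same function as the CPU path: ReLU with an optional negative_slope (leaky ReLU), top[i] = max(bottom[i], 0) + negative_slope * min(bottom[i], 0). A plain-Java sketch of that element-wise rule, for illustration only; in this layer the actual computation is performed by cuDNN on the device:

```java
// Element-wise ReLU forward, matching the math of caffe::ReLULayer.
// negativeSlope = 0 gives standard ReLU; > 0 gives leaky ReLU.
class ReluForward {
    static float[] forward(float[] bottom, float negativeSlope) {
        float[] top = new float[bottom.length];
        for (int i = 0; i < bottom.length; i++) {
            top[i] = Math.max(bottom[i], 0f)
                   + negativeSlope * Math.min(bottom[i], 0f);
        }
        return top;
    }

    public static void main(String[] args) {
        System.out.println(java.util.Arrays.toString(
                forward(new float[]{-2f, 0f, 3f}, 0f)));
    }
}
```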
@Virtual protected void Backward_gpu(@Const @ByRef FloatBlobVector top, @Const @ByRef BoolVector propagate_down, @Const @ByRef FloatBlobVector bottom)
Overrides: Backward_gpu in class FloatReLULayer
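Backward_gpu applies the chain rule for the same function: the top gradient passes through where the input was positive, and is scaled by negative_slope where it was not. Sketched in plain Java for illustration; the real layer runs this on the GPU via cuDNN:

```java
// Element-wise ReLU backward, matching the math of caffe::ReLULayer:
// bottom_diff[i] = top_diff[i] * ((bottom[i] > 0) + slope * (bottom[i] <= 0)).
class ReluBackward {
    static float[] backward(float[] topDiff, float[] bottom, float negativeSlope) {
        float[] bottomDiff = new float[bottom.length];
        for (int i = 0; i < bottom.length; i++) {
            bottomDiff[i] = topDiff[i]
                    * ((bottom[i] > 0f ? 1f : 0f)
                     + negativeSlope * (bottom[i] > 0f ? 0f : 1f));
        }
        return bottomDiff;
    }
}
```

Note the argument order in the native signature: the gradient flows from top to bottom, so top and propagate_down come first.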
Copyright © 2022. All rights reserved.