Package | Description |
---|---|
org.bytedeco.opencv.opencv_dnn | |
Modifier and Type | Class and Description |
---|---|
class | AbsLayer |
class | AccumLayer |
class | AcoshLayer |
class | AcosLayer |
class | ActivationLayer |
class | ActivationLayerInt8 |
class | ArgLayer: ArgMax/ArgMin layer. Indices are returned as floats, so the supported range is [-2^24, 2^24]. |
class | AsinhLayer |
class | AsinLayer |
class | AtanhLayer |
class | AtanLayer |
class | AttentionLayer |
class | BaseConvolutionLayer |
class | BatchNormLayer |
class | BatchNormLayerInt8 |
class | BlankLayer |
class | BNLLLayer |
class | CeilLayer |
class | CeluLayer |
class | ChannelsPReLULayer |
class | CompareLayer |
class | ConcatLayer |
class | ConstLayer: Constant layer produces the same data blob at every forward pass. |
class | ConvolutionLayer |
class | ConvolutionLayerInt8 |
class | CorrelationLayer |
class | CoshLayer |
class | CosLayer |
class | CropAndResizeLayer |
class | CropLayer |
class | CumSumLayer |
class | DataAugmentationLayer |
class | DeconvolutionLayer |
class | DequantizeLayer |
class | DetectionOutputLayer: Detection output layer. |
class | EinsumLayer: Performs array summation based on the Einstein summation convention. |
class | EltwiseLayer: Element-wise operation on inputs. |
class | EltwiseLayerInt8 |
class | ELULayer |
class | ErfLayer |
class | ExpandLayer |
class | ExpLayer |
class | FlattenLayer |
class | FloorLayer |
class | FlowWarpLayer |
class | GatherElementsLayer: GatherElements layer. Takes two inputs, data and indices, of the same rank r >= 1 and an optional attribute axis, such that for r = 3: output[i][j][k] = data[index[i][j][k]][j][k] if axis = 0; data[i][index[i][j][k]][k] if axis = 1; data[i][j][index[i][j][k]] if axis = 2. Gather, on the other hand, takes a data tensor of rank r >= 1 and an indices tensor of rank q, gathers the entries along the axis dimension of the input data indexed by indices, and concatenates them into an output tensor of rank q + (r - 1). |
class | GatherLayer: Gather layer. |
class | GeluApproximationLayer |
class | GeluLayer |
class | GemmLayer |
class | GRULayer: Single-layer GRU recurrent layer. Accepts an input sequence and computes the final hidden state for each element in the batch. |
class | HardSigmoidLayer |
class | HardSwishLayer |
class | InnerProductLayer: InnerProduct, MatMul and Gemm operations are all implemented by the fully connected layer. |
class | InnerProductLayerInt8 |
class | InstanceNormLayer |
class | InterpLayer: Bilinear resize layer from https://github.com/cdmh/deeplab-public-ver2. Differs from ResizeLayer in its output shape and resize-scale computations. |
class | LayerNormLayer |
class | LogLayer |
class | LRNLayer |
class | LSTMLayer: LSTM recurrent layer. |
class | MatMulLayer |
class | MaxUnpoolLayer |
class | MishLayer |
class | MVNLayer |
class | NaryEltwiseLayer |
class | NormalizeBBoxLayer: L_p-normalization layer. |
class | NotLayer |
class | PaddingLayer: Adds extra values for specific axes. |
class | PermuteLayer |
class | PoolingLayer |
class | PoolingLayerInt8 |
class | PowerLayer |
class | PriorBoxLayer |
class | ProposalLayer |
class | QuantizeLayer |
class | ReciprocalLayer |
class | ReduceLayer |
class | RegionLayer |
class | ReLU6Layer |
class | ReLULayer |
class | ReorgLayer |
class | RequantizeLayer |
class | ReshapeLayer |
class | ResizeLayer: Resizes an input 4-dimensional blob by the nearest-neighbor or bilinear strategy. |
class | RNNLayer: Classical recurrent layer. |
class | RoundLayer |
class | ScaleLayer |
class | ScaleLayerInt8 |
class | ScatterLayer |
class | ScatterNDLayer |
class | SeluLayer |
class | ShiftLayer |
class | ShiftLayerInt8 |
class | ShrinkLayer |
class | ShuffleChannelLayer: Permutes channels of a 4-dimensional input blob. |
class | SigmoidLayer |
class | SignLayer |
class | SinhLayer |
class | SinLayer |
class | SliceLayer: Slice layer has several modes. |
class | SoftmaxLayer |
class | SoftmaxLayerInt8 |
class | SoftplusLayer |
class | SoftsignLayer |
class | SplitLayer |
class | SqrtLayer |
class | SwishLayer |
class | TanHLayer |
class | TanLayer |
class | ThresholdedReluLayer |
class | TileLayer |
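The GatherElements indexing rule listed above can be checked with a small, self-contained sketch using plain Java arrays, for rank r = 2 and axis = 0, where the rule reduces to output[i][j] = data[index[i][j]][j]. The class and method names here are illustrative only, not part of the bindings:

```java
// Illustrates the GatherElements rule for r = 2, axis = 0:
//   output[i][j] = data[index[i][j]][j]
public class GatherElementsDemo {
    static float[][] gatherElementsAxis0(float[][] data, int[][] index) {
        float[][] out = new float[index.length][index[0].length];
        for (int i = 0; i < index.length; i++)
            for (int j = 0; j < index[0].length; j++)
                // For each output position, pick the row given by index,
                // keeping the column j fixed (axis = 0 is the gathered axis).
                out[i][j] = data[index[i][j]][j];
        return out;
    }

    public static void main(String[] args) {
        float[][] data  = {{1, 2}, {3, 4}};
        int[][]   index = {{0, 0}, {1, 0}};
        float[][] out = gatherElementsAxis0(data, index);
        // out[1][1] = data[index[1][1]][1] = data[0][1] = 2
        System.out.println(java.util.Arrays.deepToString(out));
        // prints "[[1.0, 2.0], [3.0, 2.0]]"
    }
}
```

Note that the output has the same shape as the indices tensor, unlike Gather, whose output rank is q + (r - 1).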
Modifier and Type | Method and Description |
---|---|
Layer | Layer.blobs(MatVector setter) |
Layer | LayerFactory.Constructor.call(LayerParams params) |
static Layer | BlankLayer.create(LayerParams params) |
static Layer | ChannelsPReLULayer.create(LayerParams params) |
static Layer | CompareLayer.create(LayerParams params) |
static Layer | ConstLayer.create(LayerParams params) |
static Layer | CropAndResizeLayer.create(LayerParams params) |
static Layer | CropLayer.create(LayerParams params) |
static Layer | InterpLayer.create(LayerParams params) |
static Layer | ShiftLayer.create(LayerParams params) |
static Layer | ShiftLayerInt8.create(LayerParams params) |
static Layer | ShuffleChannelLayer.create(LayerParams params) |
static Layer | LayerFactory.createLayerInstance(BytePointer type, LayerParams params): Creates an instance of a registered layer. |
static Layer | LayerFactory.createLayerInstance(String type, LayerParams params) |
Layer | Net.getLayer(BytePointer layerName): Deprecated. Use int getLayerId(const String &layer). |
Layer | Net.getLayer(DictValue layerId): Deprecated. To be removed. |
Layer | Net.getLayer(int layerId): Returns a pointer to the layer with the specified id or name that the network uses. |
Layer | Net.getLayer(String layerName) |
Layer | Layer.getPointer(long i) |
Layer | Layer.name(BytePointer setter) |
Layer | Layer.position(long position) |
Layer | Layer.preferableTarget(int setter) |
Layer | Layer.type(BytePointer setter) |
Modifier and Type | Method and Description |
---|---|
static Algorithm | Layer.asAlgorithm(Layer pointer) |
boolean | Layer.tryFuse(Layer top): Tries to fuse the current layer with the next one. |
Copyright © 2024. All rights reserved.