LsrConfig
class LsrConfig(finetune_emb: bool = False, word_embedding_shape: Tuple[int, int] = (194784, 100), ner_dim: int = 20, coref_dim: int = 20, hidden_dim: int = 120, distance_size: int = 20, num_relations: int = 97, dropout_rate: float = 0.3, dropout_emb: float = 0.2, dropout_gcn: float = 0.4, use_struct_att: bool = False, use_reasoning_block: bool = True, reasoner_layer_sizes: Tuple[int, int] = (3, 4), max_length: int = 512, use_bert: bool = False, initializer_range: float = 0.02, **kwargs)

This is the configuration class to store the configuration of a LsrModel. It is used to instantiate a relation extraction model using latent structure refinement (LSR) according to the specified arguments, defining the model architecture.

Configuration objects inherit from PretrainedConfig and can be used to control the model outputs. Read the documentation from PretrainedConfig for more information.
Parameters

- finetune_emb (bool, optional, defaults to False) – Whether to fine-tune the word embedding.
- word_embedding_shape (Tuple[int, int], optional, defaults to (194784, 100)) – Shape of the word embedding matrix (vocabulary size, embedding dimension).
- ner_dim (int, optional, defaults to 20) – Dimensionality of the NER embedding.
- coref_dim (int, optional, defaults to 20) – Dimensionality of the coreference embedding.
- hidden_dim (int, optional, defaults to 120) – Dimensionality of the hidden states.
- distance_size (int, optional, defaults to 20) – Dimensionality of the distance embedding.
- num_relations (int, optional, defaults to 97) – Number of relation classes.
- dropout_rate (float, optional, defaults to 0.3) – Dropout rate for the encoding layer.
- dropout_emb (float, optional, defaults to 0.2) – Dropout rate for the embedding layer.
- dropout_gcn (float, optional, defaults to 0.4) – Dropout rate for the graph convolutional network.
- use_struct_att (bool, optional, defaults to False) – Whether to use structured attention.
- use_reasoning_block (bool, optional, defaults to True) – Whether to use the reasoning block.
- reasoner_layer_sizes (Tuple[int, int], optional, defaults to (3, 4)) – Number of layers in the reasoning block.
- max_length (int, optional, defaults to 512) – Maximum number of tokens considered in a document.
- use_bert (bool, optional, defaults to False) – Whether to use BERT as the encoder layer.
- initializer_range (float, optional, defaults to 0.02) – Initializer range for the weights.
Example:

from sgnlp.models.lsr import LsrConfig

# Initialize with default values
configuration = LsrConfig()
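To illustrate the configuration pattern without requiring sgnlp to be installed, the following is a minimal sketch of how a PretrainedConfig-style class typically behaves: named arguments with defaults define the architecture, and extra keyword arguments are absorbed via **kwargs. The class name SketchLsrConfig and its kwargs handling are illustrative assumptions, not the actual sgnlp implementation.

```python
from typing import Tuple


class SketchLsrConfig:
    """Illustrative stand-in for LsrConfig, covering a few of its fields."""

    def __init__(
        self,
        finetune_emb: bool = False,
        word_embedding_shape: Tuple[int, int] = (194784, 100),
        hidden_dim: int = 120,
        num_relations: int = 97,
        **kwargs,
    ):
        self.finetune_emb = finetune_emb
        self.word_embedding_shape = word_embedding_shape
        self.hidden_dim = hidden_dim
        self.num_relations = num_relations
        # PretrainedConfig subclasses forward unknown kwargs to the base
        # class; here we simply attach them as attributes.
        for key, value in kwargs.items():
            setattr(self, key, value)


default_cfg = SketchLsrConfig()  # all architecture defaults
custom_cfg = SketchLsrConfig(hidden_dim=256, num_relations=10)
```

In the real class, overriding a default such as hidden_dim changes the model architecture that LsrModel builds from the config, so a checkpoint trained with one configuration should be loaded with the same values.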