tvm.relay.analysis#
The Relay IR namespace containing the analysis passes.
Classes:
- AnnotatedRegionSet: Class to represent a relay expression split into regions.
- CallGraph: Class to represent a call graph.
- Feature: The features a program might contain.
Functions:
- all_dtypes: Collect the set of all data types used in expr.
- all_type_vars: Get all type variables from an expression/type.
- all_vars: Get all vars from expression expr in post-DFS order.
- bound_type_vars: Get bound type variables from an expression/type.
- bound_vars: Get bound vars from expression expr in post-DFS order.
- check_basic_block_normal_form: Check whether an expression is in basic block form.
- check_constant: Check whether an expression is constant.
- check_kind: Check that a type is well kinded and return the kind.
- count_layers: Determine the number of layers of specified ops in a graph.
- cpu: Construct a CPU device.
- detect_feature: Detect the features used in a relay program.
- extract_fused_functions: Pass to extract an IRModule of only fused primitive functions.
- extract_intermdeiate_expr: Extract a Relay Expr by its expression ID.
- free_type_vars: Get free type variables from an expression/type.
- free_vars: Get free Vars from expression expr in post-DFS order.
- get_calibration_data: Get the calibration data of a given relay graph.
- get_total_mac_number: Count the number of MACs (multiply-accumulate) of a model.
- list_fake_quantized_op_freqs: Pass to extract fake quantized op names and the frequency that they appear in fake quantized regions of an IRModule.
- list_op_freqs: Pass to extract unique operator names and how frequently they appear in an IRModule.
- post_order_visit: Recursively visit the IR in post-DFS order, applying fvisit to each node.
- search_fc_transpose: Search fc weight names in the pattern y = nn.dense(x, transpose(w, [1, 0])).
- unmatched_cases: Find cases that a match expression does not catch, if any.
- class tvm.relay.analysis.AnnotatedRegionSet(expr, region_begin_op, region_end_op)[source]#
Class to represent a relay expression split into regions.
Methods:
- __init__(expr, region_begin_op, region_end_op): Construct regions from an expression.
- get_region(expr): Get the region an expression belongs to.
- class tvm.relay.analysis.CallGraph(module)[source]#
Class to represent a call graph.
Methods:
- __init__(module): Construct a call graph.
- __str__(): Print the call graph in topological order.
- _get_global_var(var): Return the global var using a given name or GlobalVar.
- global_call_count(var): Return the number of global function calls from a given global var.
- is_recursive(var): Return whether the function corresponding to a var is recursive.
- print_var(var): Print the call graph of a global function by name or by variable.
- ref_count(var): Return the number of references to the global var.
Attributes:
- module: Return the contained Relay IR module.
- __init__(module)[source]#
Construct a call graph.
Parameters#
- module : tvm.ir.IRModule
The IR module used to create a call graph.
Returns#
- call_graph : CallGraph
A constructed call graph.
- _get_global_var(var)[source]#
Return the global var using a given name or GlobalVar.
Parameters#
- var : Union[String, tvm.relay.GlobalVar]
Returns#
- ret : tvm.relay.GlobalVar
The global var.
- global_call_count(var)[source]#
Return the number of global function calls from a given global var.
Parameters#
- var : Union[String, tvm.relay.GlobalVar]
Returns#
- ret : int
The number of global function calls from the given var.
- is_recursive(var)[source]#
Return whether the function corresponding to a var is recursive.
Parameters#
- var : Union[String, tvm.relay.GlobalVar]
Returns#
- ret : Boolean
Whether the function corresponding to var is recursive.
- print_var(var)[source]#
Print the call graph of a global function by name or by variable.
Parameters#
- var : Union[String, tvm.relay.GlobalVar]
The name or global variable.
Returns#
- ret : String
The call graph represented as a string.
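To make the call-graph queries above concrete, here is a minimal sketch in plain Python (not the TVM implementation): a call graph modeled as an adjacency map from function name to the functions it calls, with `ref_count` and `is_recursive` computed over it. The function names and graph are hypothetical.

```python
# Hypothetical call graph: function name -> names of functions it calls.
calls = {
    "main": ["square", "fact"],
    "square": ["mul"],
    "fact": ["fact", "mul"],  # fact calls itself
    "mul": [],
}

def ref_count(graph, name):
    # Number of call sites that reference `name` across all functions.
    return sum(callees.count(name) for callees in graph.values())

def is_recursive(graph, name):
    # A function is (directly) recursive if it calls itself.
    return name in graph.get(name, [])

print(ref_count(calls, "mul"))      # 2
print(is_recursive(calls, "fact"))  # True
```

TVM's CallGraph computes the same kind of information directly over the GlobalVars of an IRModule; this sketch only illustrates the semantics of the two queries.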
- class tvm.relay.analysis.Feature(value, names=None, *, module=None, qualname=None, type=None, start=1, boundary=None)[source]#
The features a program might contain.
Attributes:
- fGraph: Whether any non-atom fragment of the program is shared, making the program a graph.
- fLetRec: Whether there is a local fixpoint in the program.
- fGraph = 15#
Whether any non-atom fragment of the program is shared, making the program a graph.
- fLetRec = 16#
Whether there is a local fixpoint in the program.
- tvm.relay.analysis.all_dtypes(expr)[source]#
Collect the set of all data types used in expr.
Parameters#
- expr : tvm.relay.Expr
The input expression.
Returns#
- ret : Set[String]
Set of data types used in the expression (e.g., {'int8', 'int32'}).
- tvm.relay.analysis.all_type_vars(expr, mod=None)[source]#
Get all type variables from an expression/type.
Parameters#
- expr : Union[tvm.relay.Expr, tvm.relay.Type]
The input expression/type.
- mod : Optional[tvm.IRModule]
The global module.
Returns#
- free : List[tvm.relay.TypeVar]
The list of all type variables in post-DFS order.
- tvm.relay.analysis.all_vars(expr)[source]#
Get all vars from expression expr in post-DFS order.
Parameters#
- expr : tvm.relay.Expr
The input expression.
Returns#
- free : List[tvm.relay.Var]
The list of all variables in post-DFS order.
- tvm.relay.analysis.bound_type_vars(expr, mod=None)[source]#
Get bound type variables from an expression/type.
Parameters#
- expr : Union[tvm.relay.Expr, tvm.relay.Type]
The input expression/type.
- mod : Optional[tvm.IRModule]
The global module.
Returns#
- free : List[tvm.relay.TypeVar]
The list of bound type variables in post-DFS order.
- tvm.relay.analysis.bound_vars(expr)[source]#
Get bound vars from expression expr in post-DFS order.
Parameters#
- expr : tvm.relay.Expr
The input expression.
Returns#
- free : List[tvm.relay.Var]
The list of bound variables in post-DFS order.
- tvm.relay.analysis.check_basic_block_normal_form(expr)[source]#
Check whether an expression is in basic block form.
Parameters#
- expr : tvm.relay.Expr
The input expression.
Returns#
- result : bool
Whether the expression is in basic block form.
- tvm.relay.analysis.check_constant(expr)[source]#
Check whether an expression is constant.
Parameters#
- expr : tvm.relay.Expr
The input expression.
Returns#
- result : bool
Whether the expression is constant.
- tvm.relay.analysis.check_kind(t, mod=None)[source]#
Check that the type is well kinded and return the kind. For example, this means a type cannot contain a tensor of tensors, or be a tuple type of two shapes.
Parameters#
- t : tvm.relay.Type
The type to check.
- mod : Optional[tvm.IRModule]
The global module.
Returns#
- kind : Kind
The kind of t.
Examples#
assert check_kind(relay.TupleType([relay.TypeParam('tp1', relay.Kind.Shape)])) == Shape
assert check_kind(relay.TupleType([relay.TypeParam('tp1', relay.Kind.Type)])) == Type
- tvm.relay.analysis.count_layers(expr, valid_ops)[source]#
Determine the number of layers of specified ops in a graph. This pass computes only the deepest chain of ops rather than the total number of ops in a graph. Thus, if there are two parallel convolutions (for example), they would be considered a single layer.
Parameters#
- expr : tvm.relay.Expr, tvm.relay.Function, or tvm.ir.IRModule
The input expression.
- valid_ops : List[str]
A list of the operations that should be included in the count.
Returns#
- layer_count : int
The number of layers of the specified operations found in the graph.
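The "deepest chain, not total count" semantics can be sketched in plain Python (this is an illustration, not TVM's implementation). Nodes are hypothetical `(op_name, [input_nodes])` tuples; the layer count of a node is the maximum count over its inputs, plus one if the node's op is in `valid_ops`.

```python
def count_layers(node, valid_ops, memo=None):
    # Depth of the deepest chain of ops from `valid_ops` ending at `node`.
    if memo is None:
        memo = {}
    key = id(node)
    if key in memo:
        return memo[key]
    op, inputs = node
    depth = max((count_layers(i, valid_ops, memo) for i in inputs), default=0)
    if op in valid_ops:
        depth += 1
    memo[key] = depth
    return depth

x = ("input", [])
c1 = ("conv2d", [x])
c2 = ("conv2d", [x])      # parallel to c1, at the same depth
out = ("add", [c1, c2])   # two parallel convs still count as one conv "layer"
print(count_layers(out, {"conv2d"}))  # 1
```

Because both convolutions sit at the same depth, the count is 1, matching the "two parallel convolutions are a single layer" behavior described above.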
- tvm.relay.analysis.cpu(dev_id=0)[source]#
Construct a CPU device.
Parameters#
- dev_id : int, optional
The integer device id.
Returns#
- dev : Device
The created device.
- tvm.relay.analysis.detect_feature(a, b=None)[source]#
Detect the features used in a relay program.
Parameters#
- a : Union[tvm.relay.Expr, tvm.IRModule]
The input expression or module.
- b : Optional[Union[tvm.relay.Expr, tvm.IRModule]]
The input expression or module. The two arguments cannot both be expressions or both be modules.
Returns#
- features : Set[Feature]
Features used in the program.
- tvm.relay.analysis.extract_fused_functions(mod)[source]#
Pass to extract an IRModule of only fused primitive functions.
The ExtractFusedFunctions pass invokes SimplifyInference, FuseOps(3), and ExtractFusedFunctions, in that order.
Parameters#
- mod : tvm.IRModule
Returns#
- ret : Dict[int, tvm.relay.function.Function]
A module containing only fused primitive functions.
- tvm.relay.analysis.extract_intermdeiate_expr(mod, expr_id)[source]#
Extract a Relay Expr by its expression ID.
This function is used for extracting a Relay Expr by the expression ID of the main function that we can see in print(mod["main"]).
Parameters#
- mod : tvm.IRModule
- expr_id : the Expr ID that we want to extract
Returns#
- ret : Extracted IRModule
Examples#
# Suppose our module is printed like this:
# def @main(%x: Tensor[(1, 1, 5, 1), float32], %w1, %w2) {
#   %0 = nn.conv2d(%x, %w1, padding=[1, 1, 1, 1], channels=1, kernel_size=[3, 3]);
#   %1 = nn.conv2d(%0, %w2, padding=[1, 1, 1, 1], channels=1, kernel_size=[3, 3]);
#   %2 = add(%0, %1);
#   %3 = split(%2, indices_or_sections=1);
#   %4 = %3.0;
#   add(%4, 1f)
# }
# If we want to extract `%1 = nn.conv2d`:
from tvm import relay
relay.analysis.extract_intermdeiate_expr(mod, 1)
- tvm.relay.analysis.free_type_vars(expr, mod=None)[source]#
Get free type variables from an expression/type.
Parameters#
- expr : Union[tvm.relay.Expr, tvm.relay.Type]
The input expression/type.
- mod : Optional[tvm.IRModule]
The global module.
Returns#
- free : List[tvm.relay.TypeVar]
The list of free type variables in post-DFS order.
- tvm.relay.analysis.free_vars(expr)[source]#
Get free Vars from expression expr in post-DFS order.
Parameters#
- expr : tvm.relay.Expr
The input expression.
Returns#
- free : List[tvm.relay.Var]
The list of free variables in post-DFS order.
Note#
The fact that Vars are post-DFS ordered is useful in neural networks: usually this means the weights of previous layers are ordered first.
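The free-vs-bound distinction can be illustrated with a toy let-language in plain Python (a sketch, not Relay's representation). A variable is free if no enclosing `let` binds it; free variables are reported in post-DFS (left-to-right, bottom-up) order, with duplicates kept for simplicity.

```python
# Toy AST nodes: ("var", name) | ("let", name, value, body) | ("call", fn, [args])
def free_vars(expr, bound=frozenset()):
    kind = expr[0]
    if kind == "var":
        # Free only if not bound by an enclosing let.
        return [expr[1]] if expr[1] not in bound else []
    if kind == "let":
        _, name, value, body = expr
        # The binding is in scope for the body, not the bound value.
        return free_vars(value, bound) + free_vars(body, bound | {name})
    _, _, args = expr
    out = []
    for a in args:
        out += free_vars(a, bound)
    return out

# let y = f(w1, x) in g(y, w2):  w1, x, w2 are free; y is bound.
e = ("let", "y", ("call", "f", [("var", "w1"), ("var", "x")]),
     ("call", "g", [("var", "y"), ("var", "w2")]))
print(free_vars(e))  # ['w1', 'x', 'w2']
```

The output order mirrors the note above: variables used earlier in the program (here `w1`, `x`) come before those used later (`w2`).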
- tvm.relay.analysis.get_calibration_data(mod, data)[source]#
Get the calibration data of a given relay graph.
This pass uses the graph executor to get the calibration data of a module, which includes the input and output values of each function. The returned data uses the GlobalVar of each function as a key. Users can further access the inputs and outputs by using inputs or outputs as the key.
Following are some limitations:
1. The input module (graph) cannot have control flow.
2. The input arguments of each function cannot be tuples (outputs can be tuples).
3. We only handle top-level functions (i.e., nested functions are not handled).
4. We only handle functions with the Compiler attribute set.
Parameters#
- mod : tvm.IRModule
The input module for collecting the calibration data.
- data : Dict[str, NDArray]
The input data for running the module.
Returns#
- data : Dict[tvm.relay.GlobalVar, Dict[str, NDArray]]
- tvm.relay.analysis.get_total_mac_number(expr)[source]#
Count the number of MACs (multiply-accumulate) of a model.
Parameters#
- expr : tvm.relay.Expr
The input expression.
Returns#
- result : int64
The number of MACs (multiply-accumulate) of the model.
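As a worked example of what a MAC count measures (my own arithmetic, not a TVM call): for a single conv2d, each output element accumulates one product per kernel tap per input channel, so the MAC count is out_h * out_w * out_channels * kernel_h * kernel_w * in_channels.

```python
def conv2d_macs(out_h, out_w, out_c, k_h, k_w, in_c):
    # One multiply-accumulate per (output element, kernel tap, input channel).
    return out_h * out_w * out_c * k_h * k_w * in_c

# A 3x3 convolution from 64 to 128 channels on a 56x56 output feature map:
macs = conv2d_macs(56, 56, 128, 3, 3, 64)
print(macs)  # 231211008, i.e. ~0.23 GMACs for this one layer
```

A model's total is the sum of such per-operator counts, which is what get_total_mac_number reports for the supported operators.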
- tvm.relay.analysis.list_fake_quantized_op_freqs(mod)[source]#
Pass to extract fake quantized op names and the frequency that they appear in fake quantized regions of an IRModule.
Parameters#
- mod : tvm.IRModule
Returns#
- ret : Dict[str, int]
Dict of fake quantized operator names to frequency.
- tvm.relay.analysis.list_op_freqs(mod)[source]#
Pass to extract unique operator names and how frequently they appear in an IRModule. Fused functions are traversed to count the operators that compose them.
Parameters#
- mod : tvm.IRModule
Returns#
- ret : Dict[str, int]
Dict of unique operator names to frequency.
- tvm.relay.analysis.post_order_visit(expr, fvisit)[source]#
Recursively visit the IR in post-DFS order, applying fvisit to each node. Each node is guaranteed to be visited only once.
Parameters#
- expr : tvm.relay.Expr
The input expression.
- fvisit : function
The visitor function to be applied.
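The "visited only once" guarantee matters because Relay expressions can be graphs with shared subexpressions. A minimal sketch in plain Python (not the TVM implementation), over hypothetical `(op, [inputs])` nodes:

```python
def post_order_visit(node, fvisit, seen=None):
    # Visit children before the node; skip nodes already seen,
    # so shared subexpressions are visited exactly once.
    if seen is None:
        seen = set()
    if id(node) in seen:
        return
    seen.add(id(node))
    _, inputs = node
    for child in inputs:
        post_order_visit(child, fvisit, seen)
    fvisit(node)

x = ("input", [])
c = ("conv2d", [x])
out = ("add", [c, c])  # `c` is shared: visited once, not twice

order = []
post_order_visit(out, lambda n: order.append(n[0]))
print(order)  # ['input', 'conv2d', 'add']
```

Without the `seen` set, the shared `conv2d` node would be reported twice; with it, the visit order matches the post-DFS order used throughout this namespace.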
- tvm.relay.analysis.search_fc_transpose(expr)[source]#
Search fc weight names in the pattern: y = nn.dense(x, transpose(w, [1, 0])).
This function is used in the data_dep_optimization.simplify_fc_transpose method.
Parameters#
- expr : tvm.relay.Expr
Returns#
- ret : Array[String]
Array of weight variable names in the pattern y = nn.dense(x, transpose(w, [1, 0])).
- tvm.relay.analysis.unmatched_cases(match, mod=None)[source]#
Find cases that the match expression does not catch, if any.
Parameters#
- match : tvm.relay.Match
The match expression.
- mod : Optional[tvm.IRModule]
The module (defaults to an empty module).
Returns#
- missing_patterns : [tvm.relay.Pattern]
Patterns that the match expression does not catch.
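For intuition, exhaustiveness checking over a flat (non-nested) match can be sketched in plain Python: the missing cases are the ADT constructors not covered by any pattern, with a wildcard covering everything. This toy version ignores nested patterns, which the real analysis handles.

```python
def unmatched_cases(constructors, patterns):
    # "_" is a wildcard pattern that catches every constructor.
    if "_" in patterns:
        return []
    return [c for c in constructors if c not in patterns]

# A hypothetical Option-like ADT with constructors Some and None.
option = ["Some", "None"]
print(unmatched_cases(option, ["Some"]))       # ['None']  (non-exhaustive)
print(unmatched_cases(option, ["Some", "_"]))  # []        (wildcard covers the rest)
```

An empty result means the match is exhaustive, which is exactly what a caller of unmatched_cases checks for.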