2 changes: 1 addition & 1 deletion docs/advanced_security/public_commitments.md
@@ -20,7 +20,7 @@ You can also implement such a pattern using Halo2's `Fixed` columns _if the priv

The annoyance in using `Fixed` columns comes from the fact that they require generating a new verifying key every time a new set of commitments is generated.

- > **Example:** Say for instance an application leverages a zero-knowledge circuit to prove the correct execution of a neural network. Every week the neural network is finetuned or retrained on new data. If the architecture remains the same then commiting to the new network parameters, along with a new proof of performance on a test set, would be an ideal setup. If we leverage `Fixed` columns to commit to the model parameters, each new commitment will require re-generating a verifying key and sharing the new key with the verifier(s). This is not-ideal UX and can become expensive if the verifier is deployed on-chain.
+ > **Example:** Say for instance an application leverages a zero-knowledge circuit to prove the correct execution of a neural network. Every week the neural network is finetuned or retrained on new data. If the architecture remains the same then committing to the new network parameters, along with a new proof of performance on a test set, would be an ideal setup. If we leverage `Fixed` columns to commit to the model parameters, each new commitment will require re-generating a verifying key and sharing the new key with the verifier(s). This is not-ideal UX and can become expensive if the verifier is deployed on-chain.

An ideal commitment would thus have the low cost of a `Fixed` column but wouldn't require regenerating a new verifying key for each new commitment.

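The verifying-key coupling described in this hunk is easiest to see in a small circuit. The sketch below is illustrative only, not ezkl's implementation: it assumes the zcash `halo2_proofs` crate with the Pasta curves (ezkl targets a different fork and curve), and `PARAM` stands in for a committed model parameter. Because the value sits in a `Fixed` column, it is baked into the preprocessed circuit data, so committing to a new value forces a fresh keygen.

```rust
// A minimal sketch, NOT ezkl's implementation. A hypothetical model parameter
// `PARAM` is "committed" by assigning it to a `Fixed` column, so its value is
// part of the circuit description and hence of the verifying key.
use halo2_proofs::{
    circuit::{Layouter, SimpleFloorPlanner, Value},
    pasta::{EqAffine, Fp},
    plonk::{keygen_vk, Circuit, Column, ConstraintSystem, Error, Fixed},
    poly::commitment::Params,
};

// Hypothetical committed value; changing it changes the verifying key.
const PARAM: u64 = 42;

#[derive(Clone)]
struct FixedCommitConfig {
    param: Column<Fixed>,
}

struct FixedCommitCircuit;

impl Circuit<Fp> for FixedCommitCircuit {
    type Config = FixedCommitConfig;
    type FloorPlanner = SimpleFloorPlanner;

    fn without_witnesses(&self) -> Self {
        FixedCommitCircuit
    }

    fn configure(meta: &mut ConstraintSystem<Fp>) -> Self::Config {
        // Fixed columns are part of the preprocessed circuit data.
        FixedCommitConfig {
            param: meta.fixed_column(),
        }
    }

    fn synthesize(
        &self,
        config: Self::Config,
        mut layouter: impl Layouter<Fp>,
    ) -> Result<(), Error> {
        layouter.assign_region(
            || "commit to parameter",
            |mut region| {
                // The committed value lives in the fixed column, not in the witness.
                region.assign_fixed(|| "param", config.param, 0, || {
                    Value::known(Fp::from(PARAM))
                })?;
                Ok(())
            },
        )
    }
}

fn main() {
    // Keygen bakes the fixed-column values into the verifying key, so a new
    // set of commitments requires regenerating and redistributing the key.
    let params = Params::<EqAffine>::new(4);
    let _vk = keygen_vk(&params, &FixedCommitCircuit).expect("keygen should succeed");
    println!("verifying key generated; a different PARAM would yield a different key");
}
```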
2 changes: 1 addition & 1 deletion examples/mlp_4d_einsum.rs
@@ -72,7 +72,7 @@ impl<const LEN: usize, const LOOKUP_MIN: IntegerRep, const LOOKUP_MAX: IntegerRe
.configure_range_check(cs, &input, &params, (0, 1023), K)
.unwrap();

- // sets up a new ReLU table and resuses it for l1 and l3 non linearities
+ // sets up a new ReLU table and reuses it for l1 and l3 non linearities
layer_config
.configure_lookup(
cs,
4 changes: 2 additions & 2 deletions ezkl.pyi
@@ -484,7 +484,7 @@ def ipa_commit(message:typing.Sequence[str],vk_path:str | os.PathLike | pathlib.
Arguments
-------
message: list[str]
- List of field elements represnted as strings
+ List of field elements represented as strings

vk_path: str
Path to the verification key
@@ -508,7 +508,7 @@ def kzg_commit(message:typing.Sequence[str],vk_path:str | os.PathLike | pathlib.
Arguments
-------
message: list[str]
- List of field elements represnted as strings
+ List of field elements represented as strings

vk_path: str
Path to the verification key
2 changes: 1 addition & 1 deletion src/bindings/universal.rs
@@ -90,7 +90,7 @@ pub fn poseidon_hash_no_felt(message: Vec<u8>) -> Result<Vec<u8>, EZKLError> {
/// Encode verifier calldata from proof and ethereum vk_address
#[cfg_attr(feature = "ios-bindings", uniffi::export)]
pub fn encode_verifier_calldata(
- // TODO - shuold it be pub or pub or pub(super)?
+ // TODO - should it be pub or pub or pub(super)?
proof: Vec<u8>,
vka: Option<Vec<u8>>,
) -> Result<Vec<u8>, EZKLError> {
2 changes: 1 addition & 1 deletion src/circuit/ops/mod.rs
@@ -55,7 +55,7 @@ pub trait Op<F: PrimeField + TensorType + PartialOrd + std::hash::Hash>:
/// Returns the scale of the output of the operation.
fn out_scale(&self, _: Vec<crate::Scale>) -> Result<crate::Scale, CircuitError>;

- /// Do any of the inputs to this op require homogenous input scales?
+ /// Do any of the inputs to this op require homogeneous input scales?
fn requires_homogenous_input_scales(&self) -> Vec<usize> {
vec![]
}
2 changes: 1 addition & 1 deletion src/commands.rs
@@ -546,7 +546,7 @@ pub enum Commands {
/// The path to output the desired srs file, if set to None will save to ~/.ezkl/srs
#[arg(long, default_value = None, value_hint = clap::ValueHint::FilePath)]
srs_path: Option<PathBuf>,
- /// Path to the circuit settings .json file to read in logrows from. Overriden by logrows if specified.
+ /// Path to the circuit settings .json file to read in logrows from. Overridden by logrows if specified.
#[arg(short = 'S', long, default_value = DEFAULT_SETTINGS, value_hint = clap::ValueHint::FilePath)]
settings_path: Option<PathBuf>,
/// Number of logrows to use for srs. Overrides settings_path if specified.
4 changes: 2 additions & 2 deletions src/graph/errors.rs
@@ -33,7 +33,7 @@ pub enum GraphError {
#[error("a node is missing required params: {0}")]
MissingParams(String),
/// A node has missing parameters
#[error("a node has misformed params: {0}")]
#[error("a node has malformed params: {0}")]
MisformedParams(String),
/// Error in the configuration of the visibility of variables
#[error("there should be at least one set of public variables")]
@@ -51,7 +51,7 @@
#[error("[io] ({0}) {1}")]
ReadWriteFileError(String, String),
/// Model serialization error
#[error("failed to ser/deser model: {0}")]
#[error("failed to serialize/deserialize model: {0}")]
ModelSerialize(#[from] bincode::Error),
/// Tract error
#[cfg(all(
2 changes: 1 addition & 1 deletion src/graph/mod.rs
@@ -1227,7 +1227,7 @@ impl GraphCircuit {
let reader = std::io::BufReader::with_capacity(*EZKL_BUF_CAPACITY, f);
let result: GraphCircuit = bincode::deserialize_from(reader)?;

- // check the versions matche
+ // check that the versions match
crate::check_version_string_matches(&result.core.settings.version);

Ok(result)
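If the buffered-deserialization-plus-version-check pattern in this hunk is unfamiliar, the standalone sketch below reproduces it outside of ezkl; `Artifact`, `BUF_CAPACITY`, and the error-handling policy are assumptions made for illustration.

```rust
// A standalone sketch, NOT ezkl's code: buffered bincode deserialization
// followed by a version-string check. `Artifact` and `BUF_CAPACITY` are
// illustrative assumptions.
use std::{
    fs::File,
    io::{BufReader, BufWriter},
    path::Path,
};

use serde::{Deserialize, Serialize};

const BUF_CAPACITY: usize = 8000;

#[derive(Serialize, Deserialize)]
struct Artifact {
    version: String,
    payload: Vec<u8>,
}

fn load(path: &Path, expected_version: &str) -> Result<Artifact, Box<dyn std::error::Error>> {
    let f = File::open(path)?;
    // A large buffer avoids issuing many small reads for big artifacts.
    let reader = BufReader::with_capacity(BUF_CAPACITY, f);
    let artifact: Artifact = bincode::deserialize_from(reader)?;

    // Refuse artifacts produced by an incompatible version.
    if artifact.version != expected_version {
        return Err(format!(
            "version mismatch: artifact {} vs expected {}",
            artifact.version, expected_version
        )
        .into());
    }
    Ok(artifact)
}

fn main() -> Result<(), Box<dyn std::error::Error>> {
    let path = std::env::temp_dir().join("artifact.bin");
    let artifact = Artifact {
        version: "0.0.0".into(),
        payload: vec![1, 2, 3],
    };
    bincode::serialize_into(BufWriter::new(File::create(&path)?), &artifact)?;

    let loaded = load(&path, "0.0.0")?;
    assert_eq!(loaded.payload, vec![1, 2, 3]);
    Ok(())
}
```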
4 changes: 2 additions & 2 deletions src/pfsys/evm/aggregation_kzg.rs
@@ -323,9 +323,9 @@ impl AggregationCircuit {
}

/// Number of instance variables for the aggregation circuit, used in generating verifier.
- pub fn num_instance(orginal_circuit_instances: usize) -> Vec<usize> {
+ pub fn num_instance(original_circuit_instances: usize) -> Vec<usize> {
let accumulation_instances = 4 * LIMBS;
- vec![accumulation_instances + orginal_circuit_instances]
+ vec![accumulation_instances + original_circuit_instances]
}

/// Instance variables for the aggregation circuit, fed to verifier.
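As a quick worked check of the arithmetic in this hunk, the standalone snippet below assumes `LIMBS = 4`, i.e. the accumulator's two curve points contribute 2 points × 2 coordinates × 4 limbs = 16 instance values; the actual limb count is configured elsewhere in the codebase.

```rust
// Standalone illustration of the instance-count arithmetic, assuming LIMBS = 4.
const LIMBS: usize = 4;

fn num_instance(original_circuit_instances: usize) -> Vec<usize> {
    // Two accumulator points, two coordinates each, LIMBS limbs per coordinate.
    let accumulation_instances = 4 * LIMBS;
    vec![accumulation_instances + original_circuit_instances]
}

fn main() {
    // An inner circuit exposing 3 public inputs yields one instance column
    // of length 16 + 3 = 19 in the aggregation circuit.
    assert_eq!(num_instance(3), vec![19]);
}
```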
2 changes: 1 addition & 1 deletion src/tensor/mod.rs
@@ -2,7 +2,7 @@
pub mod errors;
/// Implementations of common operations on tensors.
pub mod ops;
- /// A wrapper around a tensor of circuit variables / advices.
+ /// A wrapper around a tensor of circuit variables / advice values.
pub mod val;
/// A wrapper around a tensor of Halo2 Value types.
pub mod var;
4 changes: 2 additions & 2 deletions src/tensor/ops.rs
@@ -241,9 +241,9 @@ pub fn trilu<T: TensorType + std::marker::Send + std::marker::Sync>(
// The lower triangular part consists of elements on and below the diagonal. All other elements in the matrix are set to zero.

let batch_dims = &a.dims()[0..a.dims().len() - 2];
- let batch_cartiesian = batch_dims.iter().map(|d| 0..*d).multi_cartesian_product();
+ let batch_cartesian = batch_dims.iter().map(|d| 0..*d).multi_cartesian_product();

- for b in batch_cartiesian {
+ for b in batch_cartesian {
for i in 0..a.dims()[1] {
for j in 0..a.dims()[2] {
let mut coord = b.clone();
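For readers unfamiliar with `itertools`' `multi_cartesian_product`, the standalone sketch below (not ezkl's tensor code; the batch shape is an arbitrary example) shows how it enumerates every batch coordinate so the two inner loops can run once per matrix in the batch.

```rust
// Standalone illustration of the batch-coordinate enumeration used in `trilu`,
// assuming the `itertools` crate. For batch dims [2, 3] it yields the six
// coordinates [0, 0], [0, 1], ..., [1, 2], one per matrix in the batch.
use itertools::Itertools;

fn main() {
    let batch_dims = [2usize, 3];
    let batch_cartesian = batch_dims.iter().map(|d| 0..*d).multi_cartesian_product();

    let coords: Vec<Vec<usize>> = batch_cartesian.collect();
    assert_eq!(coords.len(), 6);
    assert_eq!(coords[0], vec![0, 0]);
    assert_eq!(coords[5], vec![1, 2]);

    // In `trilu`, each coordinate `b` prefixes the row/column indices (i, j)
    // of the matrix at that batch position.
    for b in &coords {
        println!("batch coordinate: {:?}", b);
    }
}
```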