Electrical and Electronic Engineering - Research Publications

Search Results

Now showing 1 - 10 of 33
  • Item
    Zero-Error Feedback Capacity for Bounded Stabilization and Finite-State Additive Noise Channels
    Saberi, A ; Farokhi, F ; Nair, GN (IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC, 2022-10)
  • Item
    Bounded Estimation Over Finite-State Channels: Relating Topological Entropy and Zero-Error Capacity
    Saberi, A ; Farokhi, F ; Nair, GN (IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC, 2022-08)
  • Item
    Optimal contract design for effort-averse sensors
    Farokhi, F ; Shames, I ; Cantoni, M (Taylor & Francis, 2018-06-28)
    A central planner wishes to engage a collection of sensors to measure a quantity. Each sensor seeks to trade off the effort it invests to obtain and report a measurement against the contracted reward. Assuming that measurement quality improves as a sensor increases the effort it invests, the problem of reward contract design is investigated. To this end, a game is formulated between the central planner and the sensors. Using this game, it is established that the central planner can enhance the quality of the estimate by rewarding each sensor based on the distance between the average of the received measurements and the measurement provided by that sensor. Optimal contracts are designed from the perspective of the budget required to achieve a specified level of error performance.
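    A minimal Python sketch of the reward structure the abstract describes: each sensor is paid a base amount minus a penalty that grows with the distance between its report and the average report. The quadratic penalty and the parameter names (base_reward, penalty_weight) are illustrative assumptions, not the paper's optimal contract.

    ```python
    import numpy as np

    def rewards(reports, base_reward=1.0, penalty_weight=0.5):
        """Pay each sensor based on how close its report is to the average report."""
        reports = np.asarray(reports, dtype=float)
        deviation = np.abs(reports - reports.mean())
        return base_reward - penalty_weight * deviation**2

    # Three sensors: the outlier receives the smallest reward, which (under the
    # assumption that careful, high-effort measurements cluster together) rewards effort.
    print(rewards([1.0, 1.1, 2.0]))
    ```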
  • Item
    Ensuring privacy with constrained additive noise by minimizing Fisher information
    Farokhi, F ; Sandberg, H (PERGAMON-ELSEVIER SCIENCE LTD, 2019-01)
    The problem of preserving the privacy of individual entries of a database when responding to linear or nonlinear queries with constrained additive noise is considered. For privacy protection, the response to the query is systematically corrupted with additive random noise whose support is contained in a pre-defined constraint set. A measure of privacy using the inverse of the trace of the Fisher information matrix is developed. The Cramér–Rao bound relates the variance of any estimator of the database entries to the introduced privacy measure. The probability density that minimizes the trace of the Fisher information (as a proxy for maximizing the measure of privacy) is computed. An extension to dynamic problems is also presented. Finally, the results are compared to the differential privacy methodology.
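    The abstract's two core relations can be written compactly; the notation below (p for the noise density, I(x) for the Fisher information matrix, S for the constraint set) is illustrative shorthand rather than the paper's exact formulation.

    ```latex
    % Cramér–Rao bound: any unbiased estimator \hat{x} of the database entries x obeys
    \operatorname{var}(\hat{x}_i) \;\ge\; \bigl[\mathcal{I}(x)^{-1}\bigr]_{ii},
    % so making tr(I(x)) small forces a large estimation error for every entry.
    % The privacy-optimal noise density then solves, schematically,
    \min_{p}\ \operatorname{tr}\,\mathcal{I}(x)
    \quad\text{s.t.}\quad \operatorname{supp}(p)\subseteq\mathcal{S},\ \ p\ge 0,\ \ \int p = 1.
    ```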
  • Item
    Optimal Stochastic Evasive Maneuvers Using the Schrödinger's Equation
    Farokhi, F ; Egerstedt, M (IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC, 2019-07)
    In this letter, prey with stochastic evasion policies are considered. The stochasticity adds unpredictable changes to the prey's path for avoiding the predator's attacks. The prey's cost function is composed of two terms balancing the unpredictability factor (using stochasticity to make it difficult for the predator to forecast the prey's future positions) and energy consumption (the least amount of energy required for performing a maneuver). The optimal probability density function of the prey's actions, trading off unpredictability against energy consumption, is shown to be characterized by the stationary Schrödinger's equation.
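    A schematic form of this characterization, assuming a scalar maneuver u, an energy cost c(u), and a trade-off weight lambda; all of this notation is an assumption made for illustration, not the letter's exact statement.

    ```latex
    % Stationary Schrödinger-type eigenvalue problem; its solution \psi characterizes the policy
    -\lambda\,\frac{\mathrm{d}^{2}\psi}{\mathrm{d}u^{2}}(u) \;+\; c(u)\,\psi(u) \;=\; E\,\psi(u),
    \qquad p^{\star}(u) \;\propto\; \psi(u)^{2},
    % i.e. the optimal evasion density is the squared eigenfunction, with the derivative term
    % promoting spread (unpredictability) and the potential c(u) penalizing energetic maneuvers.
    ```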
  • Item
    Development and Analysis of Deterministic Privacy-Preserving Policies Using Non-Stochastic Information Theory
    Farokhi, F (IEEE, 2019-10)
    A deterministic privacy metric using non-stochastic information theory is developed. Particularly, maximin information is used to construct a measure of information leakage, which is inversely proportional to the measure of privacy. Anyone can submit a query to a trusted agent with access to a non-stochastic uncertain private dataset. Optimal deterministic privacy-preserving policies for responding to the submitted query are computed by maximizing the measure of privacy subject to a constraint on the worst-case quality of the response (i.e., the worst-case difference between the response by the agent and the output of the query computed on the private dataset). The optimal privacy-preserving policy is proved to be a piecewise constant function in the form of a quantization operator applied on the output of the submitted query. The measure of privacy is also used to analyze k-anonymity (a popular deterministic mechanism for privacy-preserving release of datasets using suppression and generalization techniques), proving that it is in fact not privacy preserving.
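    A minimal Python sketch of a piecewise-constant (quantized) response policy of the kind the abstract proves optimal. The query range and the way the worst-case error maps to a bin width are illustrative assumptions.

    ```python
    import numpy as np

    def quantized_response(query_output, worst_case_error, lo=0.0, hi=100.0):
        """Report only the midpoint of the bin containing the true query output.
        Bins of width 2*worst_case_error keep the response error within the
        worst-case bound while revealing nothing finer than the bin index."""
        width = 2.0 * worst_case_error
        idx = int(np.floor((np.clip(query_output, lo, hi) - lo) / width))
        midpoint = lo + (idx + 0.5) * width
        return min(midpoint, hi - worst_case_error)  # clamp the last, possibly partial, bin

    print(quantized_response(37.2, worst_case_error=5.0))  # -> 35.0
    ```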
  • Item
    Feedback control using a strategic sensor
    Farokhi, F (TAYLOR & FRANCIS LTD, 2021-01-02)
    A dynamic estimation and control problem with a strategic sensor is considered. The strategic sensor may provide corrupted messages about the state measurements of a discrete-time linear time-invariant dynamical system to the system operator (or the controller). The system operator then uses this information to construct an estimate of the state of the system (and perhaps private variables of the sensor). The estimate is used to control the system to achieve the operator's desired objective, which might conflict with that of the strategic sensor. The problem is formulated as a game, an equilibrium of the game is computed, and its properties are investigated.
  • Item
    Structured preconditioning of conjugate gradients for path-graph network optimal control problems
    Zafar, A ; Cantoni, M ; Farokhi, F (IEEE, 2021-01-01)
    A structured preconditioned conjugate gradient (PCG) based linear system solver is developed for implementing Newton updates in second-order methods for a class of constrained network optimal control problems. Of specific interest are problems with discrete-time dynamics arising from the path-graph interconnection of N heterogeneous sub-systems. The arithmetic complexity of each PCG step is O(NT), where T is the length of the time horizon. The proposed preconditioning involves a fixed number of block Jacobi iterations per PCG step. A decreasing analytic bound on the effective conditioning is given in terms of this number. The computations are decomposable across the spatial and temporal dimensions of the optimal control problem into sub-problems of size independent of N and T. Numerical results are provided for two example systems.
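    A compact Python sketch of the solver structure the abstract describes: a textbook PCG loop whose preconditioner solves with the block-diagonal part of the matrix (a single block Jacobi sweep here, whereas the paper applies a fixed number of sweeps per step). The block-tridiagonal test matrix is only a stand-in for the path-graph-structured systems of the paper.

    ```python
    import numpy as np

    def make_block_jacobi_precond(A, block_size):
        """Preconditioner: exact solves with the block-diagonal part of A."""
        n = A.shape[0]
        inv_blocks = [np.linalg.inv(A[s:s + block_size, s:s + block_size])
                      for s in range(0, n, block_size)]

        def apply(r):
            z = np.empty_like(r)
            for i, B_inv in enumerate(inv_blocks):
                s = i * block_size
                z[s:s + block_size] = B_inv @ r[s:s + block_size]
            return z
        return apply

    def pcg(A, b, precond, tol=1e-8, max_iter=500):
        """Textbook preconditioned conjugate gradient iteration."""
        x = np.zeros_like(b)
        r = b - A @ x
        z = precond(r)
        p = z.copy()
        rz = r @ z
        for _ in range(max_iter):
            Ap = A @ p
            alpha = rz / (p @ Ap)
            x += alpha * p
            r -= alpha * Ap
            if np.linalg.norm(r) < tol:
                break
            z = precond(r)
            rz_new = r @ z
            p = z + (rz_new / rz) * p
            rz = rz_new
        return x

    # Block-tridiagonal SPD test system mimicking N sub-systems coupled along a path.
    rng = np.random.default_rng(0)
    N, m = 6, 3
    A = 4.0 * np.eye(N * m)
    for k in range(N - 1):
        C = 0.3 * rng.standard_normal((m, m))
        A[k*m:(k+1)*m, (k+1)*m:(k+2)*m] = C
        A[(k+1)*m:(k+2)*m, k*m:(k+1)*m] = C.T
    b = rng.standard_normal(N * m)
    x = pcg(A, b, make_block_jacobi_precond(A, m))
    print(np.linalg.norm(A @ x - b))  # small residual, e.g. ~1e-9
    ```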
  • Item
    Do Auto-Regressive Models Protect Privacy? Inferring Fine-Grained Energy Consumption From Aggregated Model Parameters
    Sheikh, NU ; Asghar, HJ ; Farokhi, F ; Kaafar, MA (IEEE COMPUTER SOC, 2022-11-01)
    We investigate the extent to which statistical predictive models leak information about their training data. More specifically, based on the use case of household (electrical) energy consumption, we evaluate whether white-box access to auto-regressive (AR) models trained on such data together with background information, such as household energy data aggregates (e.g., monthly billing information) and publicly available weather data, can lead to inferring fine-grained energy data of any particular household. We construct two adversarial models aiming to infer fine-grained energy consumption patterns. Both threat models use monthly billing information of target households. The second adversary has access to the AR model for a cluster of households containing the target household. Using two real-world energy datasets, we demonstrate that this adversary can apply maximum a posteriori estimation to reconstruct daily consumption of target households with significantly lower error than the first adversary, which serves as a baseline. Such fine-grained data can essentially expose private information, such as occupancy levels. Finally, we use differential privacy (DP) to alleviate the privacy concerns of the adversary in disaggregating energy data. Our evaluations show that differentially private model parameters offer strong privacy protection against the adversary with moderate utility, captured in terms of model fitness to the cluster.
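    A minimal Python sketch of the stronger adversary's reconstruction step at the level of detail the abstract gives: find the daily profile that best fits a cluster-level AR model while reproducing the monthly billing total. The AR(1)-with-intercept form, the 30-day horizon, and the least-squares/KKT formulation are assumptions made for this sketch, not the paper's estimator.

    ```python
    import numpy as np

    def reconstruct_daily(monthly_total, ar_coef, ar_intercept, n_days=30):
        """Fit daily consumption x to an assumed AR(1) model x_t = c + a*x_{t-1} + e_t
        (Gaussian residuals) subject to the billing constraint sum(x) = monthly_total."""
        T = n_days
        # Residual operator: (D x - d)_t = x_{t+1} - a*x_t - c  for t = 0..T-2
        D = np.zeros((T - 1, T))
        for t in range(T - 1):
            D[t, t] = -ar_coef
            D[t, t + 1] = 1.0
        d = np.full(T - 1, ar_intercept)
        # Minimize ||D x - d||^2 subject to 1^T x = monthly_total, via the KKT system.
        Q = 2.0 * D.T @ D
        ones = np.ones(T)
        KKT = np.block([[Q, ones[:, None]], [ones[None, :], np.zeros((1, 1))]])
        rhs = np.concatenate([2.0 * D.T @ d, [monthly_total]])
        sol = np.linalg.solve(KKT, rhs)
        return sol[:T]

    daily = reconstruct_daily(monthly_total=300.0, ar_coef=0.8, ar_intercept=2.0)
    print(daily.sum())  # ~300.0, reproducing the billing aggregate
    ```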
  • Item
    Noiseless Privacy: Definition, Guarantees, and Applications
    Farokhi, F (Institute of Electrical and Electronics Engineers (IEEE), 2021)
    In this paper, we define noiseless privacy as a non-stochastic rival to differential privacy, requiring that the outputs of a mechanism (i.e., the function composition of a privacy-preserving mapping and a query) attain only a few values while varying the data of an individual (the logarithm of the number of distinct values is bounded by the privacy budget). Therefore, the output of the mechanism is not fully informative of the data of the individuals in the dataset. We prove several guarantees for noiselessly-private mechanisms. The information content of the output about the data of an individual, even if an adversary knows all the other entries of the private dataset, is bounded by the privacy budget. The zero-error capacity of memoryless channels using noiselessly-private mechanisms for transmission is upper bounded by the privacy budget. The performance of a non-stochastic hypothesis-testing adversary is bounded again by the privacy budget. Assuming that an adversary has access to a stochastic prior on the dataset, we prove that the estimation error of the adversary for individual entries of the dataset is lower bounded by a decreasing function of the privacy budget. In this case, we also show that the maximal leakage is bounded by the privacy budget. In addition to these privacy guarantees, we prove that noiselessly-private mechanisms admit a composition theorem and that post-processing does not weaken their privacy guarantees. We prove that quantization or binning can ensure noiseless privacy if the number of quantization levels is appropriately selected based on the sensitivity of the query and the privacy budget. Finally, we illustrate the privacy merits of noiseless privacy using multiple datasets in energy, transport, and finance.
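    A minimal Python sketch of the binning idea in the final sentences: the mechanism releases only a bin label drawn from a finite set whose size is tied to the privacy budget, so the logarithm of the number of distinct outputs stays within the budget. The natural-logarithm convention, the fixed query range, and the uniform bins are assumptions for illustration.

    ```python
    import numpy as np

    def noiseless_private_release(query_value, query_range=(0.0, 1.0), budget=2.0):
        """Release only a bin label: at most floor(exp(budget)) distinct outputs,
        so log(#outputs) <= budget."""
        n_levels = max(1, int(np.floor(np.exp(budget))))
        lo, hi = query_range
        edges = np.linspace(lo, hi, n_levels + 1)
        label = min(n_levels - 1,
                    max(0, int(np.searchsorted(edges, query_value, side="right")) - 1))
        return label, n_levels

    print(noiseless_private_release(0.63, budget=2.0))  # (4, 7): bin 4 of 7 possible outputs
    ```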