


Outline: Introduction | Methodology | Example of Application | Conclusion

Optimizing Commonality and Performance in Platform-Based Earth Observing SmallSat Architectures

Zvonimir Stojanovski, Daniel Selva
March 10, 2017

Partially funded by the Cornell University Engineering Learning Initiatives


Background

• CubeSats and other small satellites are becoming important in Earth-observing systems [1]
• Often deployed as large constellations of similar or identical satellites
• Can we use commonality to reduce mission costs without sacrificing performance?

Images from NASA.gov


Commercial Off-the-Shelf (COTS) Components

• Increasingly used for small satellites
• Can significantly reduce development cost
• Automated tool developed by Jacobs and Selva [2]:
  – Equipped with a catalog of COTS components
  – Generates and evaluates designs for a CubeSat
• We want to extend this concept to families of satellites

endurosat.com


Platform-Based Design

• Widely used in mature industries, e.g., automotive and aircraft [3]
• Scale-based: variants obtained by scaling variables such as length or area (e.g., the Airbus A3xx family)
• Modular (used here): variants obtained by combining different sets of common components (e.g., the Volkswagen A family); more appropriate for using COTS components

“Airbus A320 Family,” Global Traffic.


The Optimization Problem

• Maximize performance and minimize cost, subject to feasibility constraints
• Cost model accounts for commonality and modularity
• When commonality is used, cost is typically lower, but the design is not tailored specifically to each mission
• This is the main trade-off in this problem


How the Tool Works

• A genetic algorithm (NSGA-II) proposes candidate platform designs, drawing on the component catalog and the mission requirements
• For each candidate: select a module scheme → select components for each mission → compute cost → check feasibility (if not feasible, apply a penalty) → compute performance
• Each evaluated platform design is fed back to the algorithm, which ultimately outputs the results
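The evaluation step of this loop can be sketched as follows. The function name, the penalty value, and the stand-in cost/feasibility/performance models are our own assumptions, not the tool's; the NSGA-II search itself (available in libraries such as pymoo) would wrap this function.

```python
PENALTY = 1e6  # assumed large cost penalty for infeasible designs

def evaluate(design, compute_cost, is_feasible, compute_performance):
    """Return (cost, performance) for one candidate platform design."""
    cost = compute_cost(design)
    if not is_feasible(design):
        cost += PENALTY          # the "Apply Penalty" branch of the loop
    return cost, compute_performance(design)

# Toy usage with stand-in models:
cost, perf = evaluate(
    {"solar_panels": 2},
    compute_cost=lambda d: 1000.0 * d["solar_panels"],
    is_feasible=lambda d: d["solar_panels"] >= 2,       # enough power
    compute_performance=lambda d: min(1.0, d["solar_panels"] / 4),
)
print(cost, perf)  # 2000.0 0.5
```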


Representation of a Satellite Design

• Each satellite must have certain components; we call these abstract components component slots
• Some slots may be empty (e.g., ADCS Actuator 2 and Propulsion)
• Components may be redundant
• To design a satellite means to select components from the catalog to fill the component slots


Modules and Platforms

• A module is a set of one or more components that is assembled prior to the main assembly of the spacecraft
• A module may fill multiple component slots at once
• Modules are assembled from components from the catalog
• The module scheme indicates which component slots are placed together in modules
• A platform is a family of spacecraft with shared modules; in a platform, all missions use the same module scheme
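These definitions can be captured in a minimal data-structure sketch. The slot names and the particular scheme below are illustrative, not from the tool: a module groups one or more component slots, and a module scheme is a partition of all slots into modules, shared by every mission in a platform.

```python
SLOTS = ["ADCS Sensor", "ADCS Actuator 1", "OBC", "Battery",
         "Antenna", "Transceiver", "Solar Panel", "Structure"]

# One candidate module scheme: each inner tuple is a module that
# fills all of its component slots at once.
scheme = [("Antenna", "Transceiver"),
          ("ADCS Sensor", "ADCS Actuator 1"),
          ("OBC",), ("Battery",),
          ("Solar Panel", "Structure")]

def is_valid_scheme(scheme, slots):
    """A module scheme must cover every component slot exactly once."""
    filled = [s for module in scheme for s in module]
    return sorted(filled) == sorted(slots)

print(is_valid_scheme(scheme, SLOTS))  # True
```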


Cost Model for Mission Platform

Total cost:

    C = C_P + C_IAT + C_L    (1)

• Component cost C_P: sum of retail costs of the COTS components
• Launch cost C_L: based on prices given by a launch provider
• Integration, assembly, and testing (IAT) cost C_IAT: affected by modular design and commonality; accumulated over the modules
• Other costs are assumed to be unaffected by the choice of components
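A minimal sketch of Eq. (1); all dollar figures below are hypothetical, not from the presentation.

```python
def total_cost(component_costs, module_iat_costs, launch_cost):
    """Eq. (1): C = C_P + C_IAT + C_L.
    C_P   = sum of retail costs of the COTS components
    C_IAT = sum of IAT costs over the modules
    C_L   = launch cost from the provider's price list
    """
    return sum(component_costs) + sum(module_iat_costs) + launch_cost

# Hypothetical figures, in dollars:
c = total_cost(component_costs=[120_000, 45_000, 80_000],
               module_iat_costs=[30_000, 25_000],
               launch_cost=300_000)
print(c)  # 600000
```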

IAT Cost of Modules

Model based on Tsai, Chen, and Lo [4]:

    A_j = γ_j Σ_i m_ij a_ij    (2)

• A_j: IAT cost of module j
• a_ij: non-modular IAT cost of component i
• m_ij: number of copies of component i in module j
• γ_j: “the savings ratio when module j is used”; a smaller γ_j gives a lower module IAT cost

A learning curve is used for multiple identical modules; A_j is then the first-unit IAT cost of module j.
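Eq. (2) and the learning-curve discount can be sketched as below. The specific Wright-style learning-curve form and the 90% slope are our assumptions; the slides only state that a learning curve is used.

```python
import math

def module_first_unit_iat(gamma_j, m, a):
    """Eq. (2): A_j = gamma_j * sum_i m_ij * a_ij.
    m[i] = m_ij, copies of component i in module j
    a[i] = a_ij, non-modular IAT cost of component i
    """
    return gamma_j * sum(mi * ai for mi, ai in zip(m, a))

def cumulative_iat(first_unit_cost, n, slope=0.9):
    """Cumulative IAT cost of n identical modules under an assumed
    Wright-style learning curve: T(n) = A_j * n**(1 + log2(slope))."""
    return first_unit_cost * n ** (1.0 + math.log2(slope))

a_j = module_first_unit_iat(0.9, m=[1, 2], a=[1000.0, 500.0])
print(a_j)  # 0.9 * (1*1000 + 2*500) = 1800.0
```

Because the learning-curve exponent is below 1, building n identical modules costs less than n times the first unit.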


Connectivity Coefficients

For each pair of components {i, k}, we define a connectivity coefficient ε_ik (symmetric: ε_ik = ε_ki). This is the cost increase or decrease factor when i and k are placed together in a module. We compute γ_j by averaging ε_ik over all pairs of components in module j.

Sample module j:

    ε_ik          Antenna   Transceiver   Battery
    Antenna          \
    Transceiver     0.8          \
    Battery         1.0         0.9          \

    ⇒ γ_j = (0.8 + 1.0 + 0.9) / 3 = 0.9
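The γ_j computation for this sample module can be reproduced directly (values taken from the table above):

```python
from itertools import combinations

# Symmetric connectivity coefficients eps_ik from the sample table.
eps = {frozenset({"Antenna", "Transceiver"}): 0.8,
       frozenset({"Antenna", "Battery"}): 1.0,
       frozenset({"Transceiver", "Battery"}): 0.9}

def gamma(module):
    """gamma_j: average of eps_ik over all component pairs in module j."""
    pairs = list(combinations(module, 2))
    return sum(eps[frozenset(p)] for p in pairs) / len(pairs)

print(round(gamma(["Antenna", "Transceiver", "Battery"]), 6))  # 0.9
```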


Heuristic for Selecting Module Schemes

• Computationally expensive (if not impossible) to evaluate all module schemes: 115,975 module schemes for 10 component slots; 1,382,958,545 for 15 slots (the number of partitions of n slots is the Bell number B(n))
• Instead, we use a heuristic based on graph theory
• The heuristic groups components together based on two factors:
  – Frequently occurring pairs (take advantage of the learning factor)
  – Low connectivity coefficients (lower first-unit IAT cost)


Procedure for Determining Module Schemes

(A part of) the initial graph, with one node per component slot: Propulsion, Antenna, Battery, Transceiver, ADCS Actuator 1, ADCS Actuator 2.

Edge weight: w_ij = ε_ij × (# distinct component pairs)

[Figure: weighted graph over the component slots; individual edge weights omitted.]


Procedure for Determining Module Schemes

• Edges are removed from heaviest to lightest
• The graph splits into n connected components, for n = 1, 2, 3, ...
• Example for n = 3 (three groups):

[Figure: the graph split into three groups over Propulsion, Antenna, Battery, Transceiver, ADCS Actuator 1, and ADCS Actuator 2; remaining edge weights 2, 2.7, and 2.7.]
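The splitting procedure can be sketched as follows. The slot names match the slides, but the edge weights below are illustrative, not the slides' actual values.

```python
def connected_components(nodes, edges):
    """Group nodes connected by the remaining edges (union-find)."""
    parent = {n: n for n in nodes}
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]   # path halving
            x = parent[x]
        return x
    for a, b in edges:
        parent[find(a)] = find(b)
    groups = {}
    for n in nodes:
        groups.setdefault(find(n), set()).add(n)
    return list(groups.values())

def candidate_schemes(nodes, weighted_edges):
    """Yield groupings as edges are removed, heaviest first."""
    edges = sorted(weighted_edges, key=lambda e: e[2], reverse=True)
    for k in range(len(edges) + 1):
        yield connected_components(nodes, [(a, b) for a, b, _ in edges[k:]])

nodes = ["Propulsion", "Antenna", "Battery", "Transceiver",
         "ADCS Actuator 1", "ADCS Actuator 2"]
edges = [("Antenna", "Transceiver", 2.7), ("Antenna", "Battery", 5.0),
         ("Battery", "Transceiver", 4.0), ("Propulsion", "Battery", 5.0),
         ("Propulsion", "ADCS Actuator 1", 2.0),
         ("ADCS Actuator 1", "ADCS Actuator 2", 2.7)]

# First grouping with exactly three modules (the n = 3 case):
three = next(g for g in candidate_schemes(nodes, edges) if len(g) == 3)
```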


Evaluating Performance for Mission Platforms

• Threshold and target values are given for the performance metrics of each mission, e.g., lifetime, downlink data rate, slew rate, pointing accuracy
• Each performance metric is normalized using a sigmoid function
• The performance of a mission is the average of its normalized performance metrics
• Platform performance score (to be maximized): weighted average of the missions' performances, with each mission's “importance” as its weight
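A sketch of the scoring scheme. The exact placement and steepness of the sigmoid are our assumptions (the slides only say a sigmoid is used); the sign of the scale makes the direction flip automatically for metrics where lower is better, such as pointing accuracy.

```python
import math

def normalize(value, thr, tar):
    """Sigmoid normalization of one metric to (0, 1), centered midway
    between threshold and target; the steepness (a quarter of the
    threshold-target gap) is an assumption. If tar < thr, lower
    values score higher."""
    mid = 0.5 * (thr + tar)
    scale = (tar - thr) / 4.0
    return 1.0 / (1.0 + math.exp(-(value - mid) / scale))

def mission_performance(metrics):
    """Average of normalized metrics; metrics is [(value, thr, tar)]."""
    return sum(normalize(v, t, g) for v, t, g in metrics) / len(metrics)

def platform_score(missions):
    """Importance-weighted average over (importance, metrics) pairs."""
    total = sum(w for w, _ in missions)
    return sum(w * mission_performance(m) for w, m in missions) / total

# Mission A's downlink metric (threshold 72, target 80 kbit/s):
print(normalize(80, 72, 80) > 0.5 > normalize(72, 72, 80))  # True
```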


Problem Overview

Inputs – for each mission:
• Payload
• Orbit
• Threshold and target values for performance
• Number of satellites
• “Importance” number
• Feasibility constraints: basic requirements for an operational satellite (e.g., solar panels produce sufficient power)

Outputs:
• Modular design for the mission family
• Total cost and cost breakdown (by mission, components, IAT, and launch)
• Performance metrics for the missions


Sample Missions – Used for Testing the Tool

Mission  #Sats  Importance  Orbit
A        20     5           LEO, 400 km, Polar
B        16     6           LEO, 600 km, Near-Polar
C        8      8           SSO, 600 km, Morning
D        5      10          SSO, 600 km, Afternoon
E        15     10          LEO, 800 km, Polar
F        5      15          SSO, 600 km, Morning


Payload Specifications for Sample Missions

Mission  Mass (g)  Power (W)  Height (mm)  Data Rate (kB/s)
A        1000      3          100          1.0
B        200       1          30           2.5
C        2000      20         150          12.5
D        3000      30         200          25.0
E        1200      10         100          5.0
F        1500      15         150          50.0

For all sample payloads: one-year reliability is 99.9%; length and width are 100 mm.


Sample Mission Requirements

Mission                              A     B     C      D      E     F
Lifetime (years)             Thr.    0.4   0.4   1.5    4      1.8   2
                             Tar.    0.5   0.5   2      5      2.2   2.5
Pointing Accuracy (deg)      Thr.    1     2     0.2    0.005  0.5   0.005
                             Tar.    0.5   1     0.1    0.001  0.1   0.001
Downlink Data Rate (kbit/s)  Thr.    72    160   800    1600   320   3200
                             Tar.    80    200   1000   2000   400   4000
Slew Rate (s to slew 30°)    Thr.    150   —     90     45     90    120
                             Tar.    120   —     60     30     60    75

• Thr. = threshold value
• Tar. = target value

Plot of All Feasible Designs Found

[Figure: all feasible designs found, plotted as performance score (0.72–0.88) versus cost ($30M–$60M); dominated and non-dominated designs are distinguished.]


Sample Module Schemes

These two platforms illustrate the trade-off between commonality and performance.

Highest-Performance Platform:

    Group  Component Slots        V
    1      ADCS Sensor            4
    2      ADCS Actuator 1        2
    3      ADCS Actuator 2        2
    4      OBC                    4
    5      Battery                6
    6      Antenna, Transceiver   3
    7      Solar Panel            4
    8      Structure              2
    9      Propulsion             1

Lowest-Cost Platform:

    Group  Component Slots                                V
    1      ADCS Sensor                                    3
    2      OBC                                            5
    3      Battery                                        4
    4      Antenna, Transceiver                           3
    5      ADCS Actuator 1, ADCS Actuator 2, Propulsion   2
    6      Structure, Solar Panel                         3

V is the number of variants of each module.


Limitations

• Performance model uses rough approximations
• Component catalog is small
• Cost model does not account for ground operations, etc.
• Connectivity coefficients are guesses
• Emphasis was on methodology


Future Work

• Expand the component catalog and the performance and cost models
• Determine connectivity coefficients more accurately
• Investigate results using more advanced genetic algorithms, e.g., with adaptive operator selection
• Evaluate the heuristic used for finding module schemes:
  – Approach 1: mathematical proof (if possible)
  – Approach 2: experimental. For some concrete examples produced by our tool, generate all possible module schemes, select the one with the lowest cost, and compare it with the module scheme selected by our heuristic


References

  • [1] D. Selva and D. Krejci, “A survey and assessment of the capabilities of Cubesats for Earth observation,” May 2012.
  • [2] M. Jacobs and D. Selva, “A CubeSat Catalog Design Tool for a Multi-Agent Architecture Development Framework,” 2015 IEEE Aerospace Conference, pp. 1–10, 2015.
  • [3] T. W. Simpson, “Product platform design and customization: Status and promise,” AI EDAM, vol. 18, no. 1, pp. 3–20, 2004.
  • [4] C. Y. Tsai, C. J. Chen, and Y. T. Lo, “A cost-based module mining method for the assemble-to-order strategy,” Journal of Intelligent Manufacturing, vol. 25, pp. 1377–1392, Nov. 2014.