Dear bertie,
I fear your query may be unanswerable without some specific objectives, e.g. the BPC variables. If micro. is included, this usually limits you to things like the n/c/m/M attribute patterns for realistic workload factors (a simple illustration is sketched below).
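To make the n/c/m/M idea concrete, here is a minimal sketch of the simplest two-class case: take n sample units, accept the lot if no more than c of them exceed the limit m. This is my own illustration, not taken from any particular standard; the plan values and defect rates in the example are placeholders only, and the function name is mine.

```python
from math import comb

def prob_accept_two_class(n: int, c: int, p_defective: float) -> float:
    """Probability of accepting a lot under a two-class attributes plan:
    take n units, accept if at most c units exceed the limit m.
    p_defective is the assumed proportion of units in the lot exceeding m."""
    return sum(comb(n, k) * p_defective**k * (1 - p_defective)**(n - k)
               for k in range(c + 1))

# Illustrative n=5, c=2 plan against lots with 10% and 30% of units over the limit
for p in (0.10, 0.30):
    print(f"p_defective={p:.0%}  P(accept)={prob_accept_two_class(5, 2, p):.3f}")
```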
The usual problem with MIL-STD, and other similar systems, is unrealistic sample sizes unless you are willing to make some substantial AQL compromises (e.g. the "special" inspection levels, S-1, S-2, etc., from memory) or the measurement is readily automated, e.g. weight. Nonetheless it has been used; the rough calculation below shows where the sample-size problem comes from.
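As a back-of-envelope illustration (mine, not a MIL-STD table lookup), consider how many units a zero-acceptance (c = 0) plan needs before it reliably rejects lots at a given percent defective; the function name and example defect levels are assumptions for the demo.

```python
from math import ceil, log

def n_for_zero_acceptance(p_defective: float, consumer_risk: float = 0.05) -> int:
    """Smallest n such that a c=0 plan accepts a lot with the given proportion
    defective with probability <= consumer_risk, i.e. (1 - p)^n <= consumer_risk."""
    return ceil(log(consumer_risk) / log(1 - p_defective))

# Sample sizes escalate quickly as the defect level of concern tightens
for p in (0.10, 0.01, 0.001):
    print(f"{p:.1%} defective -> n = {n_for_zero_acceptance(p)}")
# 10% -> 29 units, 1% -> 299 units, 0.1% -> 2995 units
```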
For variables and attributes, one relatively quick-and-dirty route is to do a little trial experimentation via Mr Gauss. First get an approximate standard deviation, or coefficient of variation, then use the customary non-complex equations to get an idea of sample size feasibility versus desired accuracy (one such equation is sketched below). Ideally this implies a sample size of ca. 30 as a minimum, but if you study a little you can see that a considerable reduction is possible in many cases.
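The "non-complex equation" I have in mind for the variables case is the standard normal-theory one, n = (z x sigma / E)^2, where E is the margin of error you want on the mean. A minimal sketch, assuming a roughly normal process; the numbers in the example (fill-weight std dev, target accuracy) are placeholders only.

```python
from math import ceil
from statistics import NormalDist

def sample_size_for_mean(std_dev: float, margin_of_error: float,
                         confidence: float = 0.95) -> int:
    """n = (z * sigma / E)^2 : units needed to estimate a mean to within
    +/- margin_of_error at the stated confidence, assuming a roughly
    normal (Mr Gauss) process with the given standard deviation."""
    z = NormalDist().inv_cdf(1 - (1 - confidence) / 2)  # e.g. 1.96 for 95%
    return ceil((z * std_dev / margin_of_error) ** 2)

# Placeholder numbers: std dev 2.0 g on a fill weight, want +/- 0.5 g at 95% confidence
print(sample_size_for_mean(std_dev=2.0, margin_of_error=0.5))  # -> 62
```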
IMEX, sampling is just as much an art (i.e. knowing how to approximate) as a science. Many company schemes are a mixture of both, usually heavy on the "art" side.
"Kramer and Twigg" is one of the general food classics from memory. Various ICMSF volumes are the standard for micro. attributes.
And yes, many companies also crudely segment by perceived risk status, often simply High / non-High IMEX.
Rgds / Charles.C