Bandwidth
The maximum rate of data transfer across a network path, measured in bits per second (bps, Mbps, Gbps). Bandwidth is the capacity of the pipe, while throughput is the actual achieved rate after accounting for protocol overhead, retransmissions, and congestion. High bandwidth does not guarantee low latency: a satellite link may offer 50 Mbps of bandwidth but 600 ms of latency. API designers conserve bandwidth by compressing payloads and using compact binary protocols such as gRPC.
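A back-of-the-envelope sketch of why the satellite example holds: total transfer time is roughly one-way latency plus serialization time (payload size divided by bandwidth). The link figures below are illustrative assumptions, and the model deliberately ignores the protocol overhead, retransmissions, and congestion the entry mentions.

```python
def transfer_time_s(payload_bytes: int, bandwidth_bps: float, latency_s: float) -> float:
    """Naive transfer-time model: one-way latency + serialization delay.

    Ignores TCP slow start, retransmissions, and protocol overhead,
    so real throughput will be lower than this estimate suggests.
    """
    return latency_s + (payload_bytes * 8) / bandwidth_bps

# Hypothetical links: the 50 Mbps / 600 ms satellite link from the entry,
# and a 50 Mbps terrestrial link with 20 ms latency.
small_api_response = 10_000    # 10 KB JSON response
large_download = 500_000_000   # 500 MB file

print(transfer_time_s(small_api_response, 50e6, 0.600))  # ~0.60 s: latency dominates
print(transfer_time_s(small_api_response, 50e6, 0.020))  # ~0.02 s
print(transfer_time_s(large_download, 50e6, 0.600))      # ~80.6 s: bandwidth dominates
print(transfer_time_s(large_download, 50e6, 0.020))      # ~80.0 s
```

For small API responses the link's latency, not its bandwidth, sets the response time, which is why high bandwidth alone does not make an API feel fast.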
Related Protocols
Mentioned in Guides
HTTP Caching: Cache-Control, ETags, and 304 Responses (HTTP Fundamentals)
HTTP Conditional Requests: If-Match, If-None-Match, and Beyond (HTTP Fundamentals)
GraphQL vs REST: When to Use Which (API Design & Best Practices)
Subresource Integrity (SRI): Protecting CDN-Hosted Scripts (Security & Authentication)
TCP Three-Way Handshake and Connection Lifecycle (DNS & Networking)
WebRTC Signaling with SIP and WebSocket (Real-Time Protocols)
gRPC Streaming vs REST Polling: Real-Time Data Delivery (Real-Time Protocols)
MQTT Protocol: Lightweight Messaging for IoT and Real-Time Systems (Real-Time Protocols)
HTTP Caching Strategy for APIs (Performance & Optimization)
HTTP Compression: gzip vs Brotli vs zstd (Performance & Optimization)
WebSocket Performance Tuning (Performance & Optimization)
HTTP/2 Server Push: Promise and Pitfalls (Performance & Optimization)
Taming Tail Latency: Why P99 Matters More Than Average (Performance & Optimization)
RFC 7540: HTTP/2 Protocol Deep Dive (Protocol Deep Dives)
RFC 7231/9110: HTTP Method Semantics — Safe, Idempotent, and Cacheable (Protocol Deep Dives)
QUIC Protocol: The Transport Layer Behind HTTP/3 (Protocol Deep Dives)
Migrating from HTTP/1.1 to HTTP/2 (Migration & Upgrades)
Rate Limiting at the Edge: WAF, CDN, and API Gateway Strategies (Production Infrastructure)
Chaos Engineering for APIs: Injecting Faults and Status Codes (Testing & Mocking)